stderr: /var/lib/ambari-agent/data/errors-757.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 420, in <module>
    NameNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 101, in start
    upgrade_suspended=params.upgrade_suspended, env=env)
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_namenode.py", line 156, in namenode
    create_log_dir=True
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/utils.py", line 269, in service
    Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start namenode'' returned 1.
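The traceback above only says that Ambari's Execute resource failed because the start command exited with status 1; the actual reason the NameNode refused to come up is not in this output, it is written by the NameNode itself under /var/log/hadoop/hdfs. A minimal way to surface it (a sketch, assuming shell access on the NameNode host; the command and file names are the ones that appear in this output, so adjust them if your paths differ):

    # Re-run the exact command Ambari issued, then read the NameNode's own .out/.log files
    ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start namenode'
    tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.out
    tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log*

If the daemon dies immediately, the exception or bind error it prints usually lands at the end of the newest .log/.out file (here the "starting namenode, logging to ..." line below names hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out) rather than in the Ambari output.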
starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out stdout: /var/lib/ambari-agent/data/output-757.txt 2017-08-09 12:45:18,607 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258 2017-08-09 12:45:18,608 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0 2017-08-09 12:45:18,608 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1} 2017-08-09 12:45:18,634 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '') 2017-08-09 12:45:18,635 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False} 2017-08-09 12:45:18,656 - checked_call returned (0, '') 2017-08-09 12:45:18,656 - Ensuring that hadoop has the correct symlink structure 2017-08-09 12:45:18,657 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf 2017-08-09 12:45:18,745 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258 2017-08-09 12:45:18,745 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0 2017-08-09 12:45:18,745 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1} 2017-08-09 12:45:18,767 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '') 2017-08-09 12:45:18,768 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False} 2017-08-09 12:45:18,787 - checked_call returned (0, '') 2017-08-09 12:45:18,788 - Ensuring that hadoop has the correct symlink structure 2017-08-09 12:45:18,788 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf 2017-08-09 12:45:18,789 - Group['spark'] {} 2017-08-09 12:45:18,790 - Group['hadoop'] {} 2017-08-09 12:45:18,791 - Group['users'] {} 2017-08-09 12:45:18,791 - Group['knox'] {} 2017-08-09 12:45:18,791 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,792 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,792 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']} 2017-08-09 12:45:18,793 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']} 2017-08-09 12:45:18,794 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,794 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']} 2017-08-09 12:45:18,795 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,795 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,796 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,796 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,797 - User['yarn'] {'gid': 'hadoop', 
'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,797 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,798 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,798 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']} 2017-08-09 12:45:18,799 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2017-08-09 12:45:18,801 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'} 2017-08-09 12:45:18,805 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if 2017-08-09 12:45:18,806 - Group['hdfs'] {} 2017-08-09 12:45:18,806 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']} 2017-08-09 12:45:18,807 - FS Type: 2017-08-09 12:45:18,807 - Directory['/etc/hadoop'] {'mode': 0755} 2017-08-09 12:45:18,820 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'} 2017-08-09 12:45:18,821 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777} 2017-08-09 12:45:18,834 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'} 2017-08-09 12:45:18,839 - Skipping Execute[('setenforce', '0')] due to not_if 2017-08-09 12:45:18,840 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'} 2017-08-09 12:45:18,842 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'} 2017-08-09 12:45:18,842 - Changing owner for /var/run/hadoop from 1011 to root 2017-08-09 12:45:18,842 - Changing group for /var/run/hadoop from 501 to root 2017-08-09 12:45:18,843 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'} 2017-08-09 12:45:18,846 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'} 2017-08-09 12:45:18,848 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'} 2017-08-09 12:45:18,848 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644} 2017-08-09 12:45:18,858 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'} 2017-08-09 12:45:18,859 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755} 2017-08-09 12:45:18,860 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'} 2017-08-09 12:45:18,864 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'} 2017-08-09 12:45:18,868 - File['/etc/hadoop/conf/topology_script.py'] 
{'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755} 2017-08-09 12:45:19,011 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258 2017-08-09 12:45:19,011 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0 2017-08-09 12:45:19,012 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1} 2017-08-09 12:45:19,032 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '') 2017-08-09 12:45:19,033 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False} 2017-08-09 12:45:19,052 - checked_call returned (0, '') 2017-08-09 12:45:19,052 - Ensuring that hadoop has the correct symlink structure 2017-08-09 12:45:19,052 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf 2017-08-09 12:45:19,053 - Stack Feature Version Info: stack_version=2.4, version=2.4.2.0-258, current_cluster_version=2.4.2.0-258 -> 2.4.2.0-258 2017-08-09 12:45:19,055 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258 2017-08-09 12:45:19,055 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0 2017-08-09 12:45:19,055 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1} 2017-08-09 12:45:19,078 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '') 2017-08-09 12:45:19,078 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.4.2.0-258', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False} 2017-08-09 12:45:19,100 - checked_call returned (0, '') 2017-08-09 12:45:19,101 - Ensuring that hadoop has the correct symlink structure 2017-08-09 12:45:19,101 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf 2017-08-09 12:45:19,106 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1} 2017-08-09 12:45:19,155 - checked_call returned (0, '2.4.2.0-258', '') 2017-08-09 12:45:19,159 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'} 2017-08-09 12:45:19,165 - File['/etc/security/limits.d/hdfs.conf'] {'content': Template('hdfs.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644} 2017-08-09 12:45:19,165 - XmlConfig['hadoop-policy.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...} 2017-08-09 12:45:19,173 - Generating config: /usr/hdp/current/hadoop-client/conf/hadoop-policy.xml 2017-08-09 12:45:19,174 - File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'} 2017-08-09 12:45:19,181 - XmlConfig['ssl-client.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...} 2017-08-09 12:45:19,189 - Generating config: 
/usr/hdp/current/hadoop-client/conf/ssl-client.xml 2017-08-09 12:45:19,189 - File['/usr/hdp/current/hadoop-client/conf/ssl-client.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'} 2017-08-09 12:45:19,195 - Directory['/usr/hdp/current/hadoop-client/conf/secure'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'cd_access': 'a'} 2017-08-09 12:45:19,196 - XmlConfig['ssl-client.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf/secure', 'configuration_attributes': {}, 'configurations': ...} 2017-08-09 12:45:19,204 - Generating config: /usr/hdp/current/hadoop-client/conf/secure/ssl-client.xml 2017-08-09 12:45:19,205 - File['/usr/hdp/current/hadoop-client/conf/secure/ssl-client.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'} 2017-08-09 12:45:19,210 - XmlConfig['ssl-server.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...} 2017-08-09 12:45:19,217 - Generating config: /usr/hdp/current/hadoop-client/conf/ssl-server.xml 2017-08-09 12:45:19,218 - File['/usr/hdp/current/hadoop-client/conf/ssl-server.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'} 2017-08-09 12:45:19,224 - XmlConfig['hdfs-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {'final': {'dfs.support.append': 'true', 'dfs.datanode.data.dir': 'true', 'dfs.namenode.http-address': 'true', 'dfs.namenode.name.dir': 'true', 'dfs.webhdfs.enabled': 'true', 'dfs.datanode.failed.volumes.tolerated': 'true'}}, 'configurations': ...} 2017-08-09 12:45:19,233 - Generating config: /usr/hdp/current/hadoop-client/conf/hdfs-site.xml 2017-08-09 12:45:19,234 - File['/usr/hdp/current/hadoop-client/conf/hdfs-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'} 2017-08-09 12:45:19,273 - XmlConfig['core-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'mode': 0644, 'configuration_attributes': {'final': {'fs.defaultFS': 'true'}}, 'owner': 'hdfs', 'configurations': ...} 2017-08-09 12:45:19,281 - Generating config: /usr/hdp/current/hadoop-client/conf/core-site.xml 2017-08-09 12:45:19,281 - File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'} 2017-08-09 12:45:19,302 - File['/usr/hdp/current/hadoop-client/conf/slaves'] {'content': Template('slaves.j2'), 'owner': 'hdfs'} 2017-08-09 12:45:19,303 - Directory['/hadoop/hdfs/namenode'] {'owner': 'hdfs', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'} 2017-08-09 12:45:19,304 - Called service start with upgrade_type: None 2017-08-09 12:45:19,304 - Ranger admin not installed 2017-08-09 12:45:19,304 - /hadoop/hdfs/namenode/namenode-formatted/ exists. 
Namenode DFS already formatted 2017-08-09 12:45:19,304 - Directory['/hadoop/hdfs/namenode/namenode-formatted/'] {'create_parents': True} 2017-08-09 12:45:19,306 - File['/etc/hadoop/conf/dfs.exclude'] {'owner': 'hdfs', 'content': Template('exclude_hosts_list.j2'), 'group': 'hadoop'} 2017-08-09 12:45:19,306 - Options for start command are: 2017-08-09 12:45:19,307 - Directory['/var/run/hadoop'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0755} 2017-08-09 12:45:19,307 - Changing owner for /var/run/hadoop from 0 to hdfs 2017-08-09 12:45:19,307 - Changing group for /var/run/hadoop from 0 to hadoop 2017-08-09 12:45:19,307 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'group': 'hadoop', 'create_parents': True} 2017-08-09 12:45:19,307 - Directory['/var/log/hadoop/hdfs'] {'owner': 'hdfs', 'group': 'hadoop', 'create_parents': True} 2017-08-09 12:45:19,308 - File['/var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid'] {'action': ['delete'], 'not_if': 'ambari-sudo.sh -H -E test -f /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid && ambari-sudo.sh -H -E pgrep -F /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid'} 2017-08-09 12:45:19,318 - Deleting File['/var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid'] 2017-08-09 12:45:19,318 - Execute['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start namenode''] {'environment': {'HADOOP_LIBEXEC_DIR': '/usr/hdp/current/hadoop-client/libexec'}, 'not_if': 'ambari-sudo.sh -H -E test -f /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid && ambari-sudo.sh -H -E pgrep -F /var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid'} 2017-08-09 12:45:23,386 - Execute['find /var/log/hadoop/hdfs -maxdepth 1 -type f -name '*' -exec echo '==> {} <==' \; -exec tail -n 40 {} \;'] {'logoutput': True, 'ignore_failures': True, 'user': 'hdfs'} ==> /var/log/hadoop/hdfs/hadoop-hdfs-datanode-USGVLHDP01.Coveris.log <== 2017-08-05 10:24:50,776 INFO datanode.DataNode (BPServiceActor.java:blockReport(364)) - Successfully sent block report 0x490382f6dcf08000, containing 1 storage report(s), of which we sent 1. The reports had 693 total blocks and used 1 RPC(s). This took 52 msec to generate and 290 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2017-08-05 10:24:50,900 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(696)) - Got finalize command for block pool BP-987764399-10.6.240.213-1500183275481 2017-08-05 14:28:14,769 INFO datanode.DirectoryScanner (DirectoryScanner.java:scan(505)) - BlockPool BP-987764399-10.6.240.213-1500183275481 Total blocks: 693, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2017-08-05 15:06:29,961 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(708)) - DatanodeCommand action: DNA_ACCESSKEYUPDATE 2017-08-05 15:06:30,089 INFO block.BlockTokenSecretManager (BlockTokenSecretManager.java:addKeys(193)) - Setting block keys 2017-08-05 16:24:51,805 INFO datanode.DataNode (BPServiceActor.java:blockReport(364)) - Successfully sent block report 0x490382f6dcf08001, containing 1 storage report(s), of which we sent 1. The reports had 693 total blocks and used 1 RPC(s). This took 82 msec to generate and 606 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 
2017-08-05 16:24:51,922 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(696)) - Got finalize command for block pool BP-987764399-10.6.240.213-1500183275481 2017-08-05 20:28:14,737 INFO datanode.DirectoryScanner (DirectoryScanner.java:scan(505)) - BlockPool BP-987764399-10.6.240.213-1500183275481 Total blocks: 693, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2017-08-05 22:24:52,252 INFO datanode.DataNode (BPServiceActor.java:blockReport(364)) - Successfully sent block report 0x490382f6dcf08002, containing 1 storage report(s), of which we sent 1. The reports had 693 total blocks and used 1 RPC(s). This took 67 msec to generate and 449 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2017-08-05 22:24:52,365 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(696)) - Got finalize command for block pool BP-987764399-10.6.240.213-1500183275481 2017-08-06 01:06:31,375 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(708)) - DatanodeCommand action: DNA_ACCESSKEYUPDATE 2017-08-06 01:06:31,747 INFO block.BlockTokenSecretManager (BlockTokenSecretManager.java:addKeys(193)) - Setting block keys 2017-08-06 02:28:15,664 INFO datanode.DirectoryScanner (DirectoryScanner.java:scan(505)) - BlockPool BP-987764399-10.6.240.213-1500183275481 Total blocks: 693, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2017-08-06 03:05:55,625 INFO util.JvmPauseMonitor (JvmPauseMonitor.java:run(196)) - Detected pause in JVM or host machine (eg GC): pause of approximately 1094ms No GCs detected 2017-08-06 03:09:42,145 INFO util.JvmPauseMonitor (JvmPauseMonitor.java:run(196)) - Detected pause in JVM or host machine (eg GC): pause of approximately 1139ms No GCs detected 2017-08-06 04:24:50,202 INFO datanode.DataNode (BPServiceActor.java:blockReport(364)) - Successfully sent block report 0x490382f6dcf08003, containing 1 storage report(s), of which we sent 1. The reports had 693 total blocks and used 1 RPC(s). This took 63 msec to generate and 318 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2017-08-06 04:24:50,289 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(696)) - Got finalize command for block pool BP-987764399-10.6.240.213-1500183275481 2017-08-06 07:46:04,966 INFO datanode.VolumeScanner (VolumeScanner.java:findNextUsableBlockIter(387)) - Now rescanning bpid BP-987764399-10.6.240.213-1500183275481 on volume /hadoop/hdfs/data, after more than 504 hour(s) 2017-08-06 08:04:50,698 INFO datanode.VolumeScanner (VolumeScanner.java:runLoop(533)) - VolumeScanner(/hadoop/hdfs/data, DS-50804907-30f5-48b5-a462-664082951776): finished scanning block pool BP-987764399-10.6.240.213-1500183275481 2017-08-06 08:04:50,777 INFO datanode.VolumeScanner (VolumeScanner.java:findNextUsableBlockIter(395)) - VolumeScanner(/hadoop/hdfs/data, DS-50804907-30f5-48b5-a462-664082951776): no suitable block pools found to scan. Waiting 1813274124 ms. 2017-08-06 08:28:13,729 INFO datanode.DirectoryScanner (DirectoryScanner.java:scan(505)) - BlockPool BP-987764399-10.6.240.213-1500183275481 Total blocks: 693, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2017-08-06 10:24:51,112 INFO datanode.DataNode (BPServiceActor.java:blockReport(364)) - Successfully sent block report 0x490382f6dcf08004, containing 1 storage report(s), of which we sent 1. 
The reports had 693 total blocks and used 1 RPC(s). This took 92 msec to generate and 467 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2017-08-06 10:24:51,257 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(696)) - Got finalize command for block pool BP-987764399-10.6.240.213-1500183275481 2017-08-06 11:06:32,547 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(708)) - DatanodeCommand action: DNA_ACCESSKEYUPDATE 2017-08-06 11:06:32,560 INFO block.BlockTokenSecretManager (BlockTokenSecretManager.java:addKeys(193)) - Setting block keys 2017-08-06 14:28:14,729 INFO datanode.DirectoryScanner (DirectoryScanner.java:scan(505)) - BlockPool BP-987764399-10.6.240.213-1500183275481 Total blocks: 693, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2017-08-06 14:38:42,986 INFO util.JvmPauseMonitor (JvmPauseMonitor.java:run(196)) - Detected pause in JVM or host machine (eg GC): pause of approximately 1012ms GC pool 'ParNew' had collection(s): count=1 time=1061ms 2017-08-06 16:24:51,548 INFO datanode.DataNode (BPServiceActor.java:blockReport(364)) - Successfully sent block report 0x490382f6dcf08005, containing 1 storage report(s), of which we sent 1. The reports had 693 total blocks and used 1 RPC(s). This took 62 msec to generate and 378 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5. 2017-08-06 16:24:51,647 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(696)) - Got finalize command for block pool BP-987764399-10.6.240.213-1500183275481 2017-08-06 20:28:14,823 INFO datanode.DirectoryScanner (DirectoryScanner.java:scan(505)) - BlockPool BP-987764399-10.6.240.213-1500183275481 Total blocks: 693, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0 2017-08-06 21:06:33,623 INFO datanode.DataNode (BPOfferService.java:processCommandFromActive(708)) - DatanodeCommand action: DNA_ACCESSKEYUPDATE 2017-08-06 21:06:33,670 INFO block.BlockTokenSecretManager (BlockTokenSecretManager.java:addKeys(193)) - Setting block keys 2017-08-06 21:33:27,997 ERROR datanode.DataNode (LogAdapter.java:error(71)) - RECEIVED SIGNAL 15: SIGTERM 2017-08-06 21:33:28,338 INFO datanode.DataNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at USGVLHDP01.Coveris/10.6.240.213 ************************************************************/ ==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.log.1 <== 2017-08-06 12:02:53,765 INFO hdfs.StateChange (FSNamesystem.java:completeFile(3545)) - DIR* completeFile: /spark-history/.0967235a-2e2c-471a-8ae1-446e6c25cd64 is closed by DFSClient_NONMAPREDUCE_1961935643_1 2017-08-06 12:02:54,274 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:02:54,275 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:02:54,275 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 
2017-08-06 12:02:57,275 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:02:57,275 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:02:57,275 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:00,275 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:03:00,275 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:03:00,275 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:00,782 INFO namenode.FSNamesystem (FSNamesystem.java:rollEditLog(6035)) - Roll Edit Log from 10.6.240.213 2017-08-06 12:03:00,782 INFO namenode.FSEditLog (FSEditLog.java:rollEditLog(1202)) - Rolling edit logs 2017-08-06 12:03:00,782 INFO namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(1258)) - Ending log segment 475509 2017-08-06 12:03:00,782 INFO namenode.FSEditLog (FSEditLog.java:printStatistics(699)) - Number of transactions: 20 Total time for transactions(ms): 0 Number of transactions batched in Syncs: 0 Number of syncs: 20 SyncTimes(ms): 24 2017-08-06 12:03:00,783 INFO namenode.FSEditLog (FSEditLog.java:printStatistics(699)) - Number of transactions: 20 Total time for transactions(ms): 0 Number of transactions batched in Syncs: 0 Number of syncs: 21 SyncTimes(ms): 25 2017-08-06 12:03:00,784 INFO namenode.FileJournalManager (FileJournalManager.java:finalizeLogSegment(133)) - Finalizing edits file /hadoop/hdfs/namenode/current/edits_inprogress_0000000000000475509 -> /hadoop/hdfs/namenode/current/edits_0000000000000475509-0000000000000475528 2017-08-06 12:03:00,784 INFO namenode.FSEditLog (FSEditLog.java:startLogSegment(1218)) - Starting log segment at 475529 2017-08-06 12:03:03,275 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:03:03,275 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:03:03,276 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:03,770 INFO hdfs.StateChange (FSNamesystem.java:completeFile(3545)) - DIR* completeFile: /spark-history/.5f12b585-d093-4b47-8da6-8f856f8d2e94 is closed by DFSClient_NONMAPREDUCE_1961935643_1 2017-08-06 12:03:06,276 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? 
false 2017-08-06 12:03:06,276 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:03:06,276 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:09,276 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:03:09,276 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:03:09,276 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:12,276 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:03:12,276 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:03:12,277 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:13,775 INFO hdfs.StateChange (FSNamesystem.java:completeFile(3545)) - DIR* completeFile: /spark-history/.3b972413-d5bc-40fe-906f-49f0c8197ad8 is closed by DFSClient_NONMAPREDUCE_1961935643_1 2017-08-06 12:03:15,277 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:03:15,277 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:03:15,277 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:18,277 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 12:03:18,277 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 12:03:18,277 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 12:03:21,277 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? 
false
2017-08-06 12:03:21,277 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0.
2017-08-06 12:03:21,278 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas.
==> /var/log/hadoop/hdfs/hadoop-hdfs-datanode-USGVLHDP01.Coveris.out.1 <==
ulimit -a for user hdfs
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7421
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
==> /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-USGVLHDP01.Coveris.out <==
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.downloadCheckpointFiles(SecondaryNameNode.java:443)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:540)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doWork(SecondaryNameNode.java:395)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$1.run(SecondaryNameNode.java:361)
        at org.apache.hadoop.security.SecurityUtil.doAsLoginUserOrFatal(SecurityUtil.java:449)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.run(SecondaryNameNode.java:357)
        at java.lang.Thread.run(Thread.java:745)
java.net.UnknownHostException: usgvlhdp01.coveris
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
        at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
        at sun.net.www.http.HttpClient.New(HttpClient.java:308)
        at sun.net.www.http.HttpClient.New(HttpClient.java:326)
        at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
        at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
        at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
        at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1513)
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
        at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
        at org.apache.hadoop.hdfs.server.namenode.TransferFsImage.doGetUrl(TransferFsImage.java:412)
        at org.apache.hadoop.hdfs.server.namenode.TransferFsImage.getFileClient(TransferFsImage.java:397)
        at org.apache.hadoop.hdfs.server.namenode.TransferFsImage.downloadEditsToStorage(TransferFsImage.java:167)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$2.run(SecondaryNameNode.java:465)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$2.run(SecondaryNameNode.java:444)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.downloadCheckpointFiles(SecondaryNameNode.java:443)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:540)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doWork(SecondaryNameNode.java:395)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$1.run(SecondaryNameNode.java:361)
        at org.apache.hadoop.security.SecurityUtil.doAsLoginUserOrFatal(SecurityUtil.java:449)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.run(SecondaryNameNode.java:357)
        at java.lang.Thread.run(Thread.java:745)
==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-25 <==
2017-07-25 23:58:36,892 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.975c8ce9-58d9-4570-b2ea-faa52886f36a dst=null perm=null proto=rpc
2017-07-25 23:58:36,895 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.975c8ce9-58d9-4570-b2ea-faa52886f36a dst=null perm=null proto=rpc
2017-07-25 23:58:36,900 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-25 23:58:46,185 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc
2017-07-25 23:58:46,909 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.765577c7-e648-4e13-bb25-13c29fff2323 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-25 23:58:46,915 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.765577c7-e648-4e13-bb25-13c29fff2323 dst=null perm=null proto=rpc
2017-07-25 23:58:46,919 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.765577c7-e648-4e13-bb25-13c29fff2323 dst=null perm=null proto=rpc
2017-07-25 23:58:46,921 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-25 23:58:56,930 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.811fee3f-0b80-44dc-b876-6fc24b3cde02 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-25 23:58:56,939 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.811fee3f-0b80-44dc-b876-6fc24b3cde02 dst=null perm=null proto=rpc
2017-07-25 23:58:56,942 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.811fee3f-0b80-44dc-b876-6fc24b3cde02 dst=null perm=null proto=rpc
2017-07-25 23:58:56,945 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-25 23:58:58,768 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5
2017-07-25 23:59:06,949 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create
src=/spark-history/.2ee4f5e1-773e-4427-bf13-22371da932f2 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-25 23:59:06,954 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.2ee4f5e1-773e-4427-bf13-22371da932f2 dst=null perm=null proto=rpc 2017-07-25 23:59:06,955 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.2ee4f5e1-773e-4427-bf13-22371da932f2 dst=null perm=null proto=rpc 2017-07-25 23:59:06,958 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-25 23:59:16,965 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.4e2dac84-8995-4296-9a25-73e326a8dc8c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-25 23:59:16,976 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.4e2dac84-8995-4296-9a25-73e326a8dc8c dst=null perm=null proto=rpc 2017-07-25 23:59:16,978 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.4e2dac84-8995-4296-9a25-73e326a8dc8c dst=null perm=null proto=rpc 2017-07-25 23:59:16,980 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-25 23:59:25,281 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-25 23:59:26,986 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.453f0e9c-30c7-49cc-95f2-5333645f6796 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-25 23:59:26,993 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.453f0e9c-30c7-49cc-95f2-5333645f6796 dst=null perm=null proto=rpc 2017-07-25 23:59:26,996 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.453f0e9c-30c7-49cc-95f2-5333645f6796 dst=null perm=null proto=rpc 2017-07-25 23:59:26,996 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-25 23:59:37,000 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.c34b8def-721a-4904-b233-e6fabe283b8e dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-25 23:59:37,004 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.c34b8def-721a-4904-b233-e6fabe283b8e dst=null perm=null proto=rpc 2017-07-25 23:59:37,006 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.c34b8def-721a-4904-b233-e6fabe283b8e dst=null perm=null proto=rpc 2017-07-25 23:59:37,007 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-25 23:59:46,171 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-25 23:59:47,015 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b5b26d20-9e57-4c72-b7de-0450400e1f2d 
dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-25 23:59:47,026 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b5b26d20-9e57-4c72-b7de-0450400e1f2d dst=null perm=null proto=rpc 2017-07-25 23:59:47,030 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b5b26d20-9e57-4c72-b7de-0450400e1f2d dst=null perm=null proto=rpc 2017-07-25 23:59:47,034 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-25 23:59:57,037 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.63e3c573-33b7-4ed5-a7f1-55604f0a392c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-25 23:59:57,042 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.63e3c573-33b7-4ed5-a7f1-55604f0a392c dst=null perm=null proto=rpc 2017-07-25 23:59:57,051 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.63e3c573-33b7-4ed5-a7f1-55604f0a392c dst=null perm=null proto=rpc 2017-07-25 23:59:57,052 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-25 23:59:58,762 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/hdfs-audit.log <== 2017-08-06 21:31:51,395 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.3168606d-4c4d-4802-8aa9-8c3a8711ae5b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:31:51,398 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.3168606d-4c4d-4802-8aa9-8c3a8711ae5b dst=null perm=null proto=rpc 2017-08-06 21:31:51,399 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.3168606d-4c4d-4802-8aa9-8c3a8711ae5b dst=null perm=null proto=rpc 2017-08-06 21:31:51,399 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:32:01,400 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.0363ab0d-8b85-4441-8d35-b4c7cc880fc8 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:32:01,404 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.0363ab0d-8b85-4441-8d35-b4c7cc880fc8 dst=null perm=null proto=rpc 2017-08-06 21:32:01,405 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.0363ab0d-8b85-4441-8d35-b4c7cc880fc8 dst=null perm=null proto=rpc 2017-08-06 21:32:01,406 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:32:11,408 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.4f921d16-2572-4898-be14-29772efc7319 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:32:11,410 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 
cmd=getfileinfo src=/spark-history/.4f921d16-2572-4898-be14-29772efc7319 dst=null perm=null proto=rpc 2017-08-06 21:32:11,412 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.4f921d16-2572-4898-be14-29772efc7319 dst=null perm=null proto=rpc 2017-08-06 21:32:11,412 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:32:21,414 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.70f85fd2-a94d-48ba-8e10-965ce92cfb66 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:32:21,417 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.70f85fd2-a94d-48ba-8e10-965ce92cfb66 dst=null perm=null proto=rpc 2017-08-06 21:32:21,418 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.70f85fd2-a94d-48ba-8e10-965ce92cfb66 dst=null perm=null proto=rpc 2017-08-06 21:32:21,419 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:32:31,425 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.ed88ad9a-7f14-4afe-846e-d56001a32dc3 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:32:31,428 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.ed88ad9a-7f14-4afe-846e-d56001a32dc3 dst=null perm=null proto=rpc 2017-08-06 21:32:31,428 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.ed88ad9a-7f14-4afe-846e-d56001a32dc3 dst=null perm=null proto=rpc 2017-08-06 21:32:31,429 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:32:41,430 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.e811bfbb-b06e-498a-b394-94951ef89aec dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:32:41,432 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.e811bfbb-b06e-498a-b394-94951ef89aec dst=null perm=null proto=rpc 2017-08-06 21:32:41,434 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.e811bfbb-b06e-498a-b394-94951ef89aec dst=null perm=null proto=rpc 2017-08-06 21:32:41,435 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:32:51,437 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.aad4c908-9779-4038-86cb-b7e61948035d dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:32:51,438 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.aad4c908-9779-4038-86cb-b7e61948035d dst=null perm=null proto=rpc 2017-08-06 21:32:51,440 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.aad4c908-9779-4038-86cb-b7e61948035d dst=null perm=null proto=rpc 2017-08-06 21:32:51,440 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) 
ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:33:01,442 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b473d6cb-293f-4bb3-828a-bb07f105fd46 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:33:01,444 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b473d6cb-293f-4bb3-828a-bb07f105fd46 dst=null perm=null proto=rpc 2017-08-06 21:33:01,445 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b473d6cb-293f-4bb3-828a-bb07f105fd46 dst=null perm=null proto=rpc 2017-08-06 21:33:01,445 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:33:11,447 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.5347cc07-e087-4e66-a99f-9dd778c5cfbb dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:33:11,448 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.5347cc07-e087-4e66-a99f-9dd778c5cfbb dst=null perm=null proto=rpc 2017-08-06 21:33:11,449 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.5347cc07-e087-4e66-a99f-9dd778c5cfbb dst=null perm=null proto=rpc 2017-08-06 21:33:11,450 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-06 21:33:21,451 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.751ca542-d201-4063-85df-ee072a0bf9de dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-06 21:33:21,453 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.751ca542-d201-4063-85df-ee072a0bf9de dst=null perm=null proto=rpc 2017-08-06 21:33:21,454 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.751ca542-d201-4063-85df-ee072a0bf9de dst=null perm=null proto=rpc 2017-08-06 21:33:21,454 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc ==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.out <== Jul 25, 2017 12:47:50 PM com.sun.jersey.api.core.PackagesResourceConfig init INFO: Scanning for root resource and provider classes in the packages: org.apache.hadoop.hdfs.server.namenode.web.resources org.apache.hadoop.hdfs.web.resources Jul 25, 2017 12:47:50 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses INFO: Root resource classes found: class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods Jul 25, 2017 12:47:50 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses INFO: Provider classes found: class org.apache.hadoop.hdfs.web.resources.ExceptionHandler class org.apache.hadoop.hdfs.web.resources.UserProvider Jul 25, 2017 12:47:50 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM' Jul 25, 2017 12:47:51 PM com.sun.jersey.spi.inject.Errors processErrorMessages WARNING: The following warnings have been detected with resource and/or provider classes: WARNING: A sub-resource 
method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method WARNING: A sub-resource method, public javax.ws.rs.core.Response 
org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method ==> /var/log/hadoop/hdfs/gc.log-201708071251 <== Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8) Memory: 4k page, physical 2053820k(348352k free), swap 8241148k(7924120k free) CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC Heap par new generation total 118016K, used 88191K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000) eden space 104960K, 84% used [0x00000000c0000000, 0x00000000c561fed0, 0x00000000c6680000) from space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000) to space 13056K, 0% used [0x00000000c7340000, 0x00000000c7340000, 0x00000000c8000000) concurrent mark-sweep generation total 917504K, used 0K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000) Metaspace used 12860K, capacity 13088K, committed 13184K, reserved 1060864K class space used 1542K, capacity 1608K, committed 1664K, reserved 1048576K ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-30 <== 2017-07-30 23:58:38,149 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.397b1ddc-27dc-4ec2-884c-a30ddf956730 dst=null 
perm=null proto=rpc 2017-07-30 23:58:38,150 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:58:46,147 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-30 23:58:48,151 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.734fb3cc-d452-4bf1-bdc1-dcc19fd0bb76 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:58:48,153 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.734fb3cc-d452-4bf1-bdc1-dcc19fd0bb76 dst=null perm=null proto=rpc 2017-07-30 23:58:48,154 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.734fb3cc-d452-4bf1-bdc1-dcc19fd0bb76 dst=null perm=null proto=rpc 2017-07-30 23:58:48,154 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:58:58,156 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.643350f1-914d-4c23-bc91-105a06940df4 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:58:58,158 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.643350f1-914d-4c23-bc91-105a06940df4 dst=null perm=null proto=rpc 2017-07-30 23:58:58,159 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.643350f1-914d-4c23-bc91-105a06940df4 dst=null perm=null proto=rpc 2017-07-30 23:58:58,159 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:58:58,770 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-30 23:58:59,921 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-30 23:59:08,160 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.49a40b5d-b875-4dae-b2d9-138130abff3b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:59:08,162 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.49a40b5d-b875-4dae-b2d9-138130abff3b dst=null perm=null proto=rpc 2017-07-30 23:59:08,162 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.49a40b5d-b875-4dae-b2d9-138130abff3b dst=null perm=null proto=rpc 2017-07-30 23:59:08,163 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:59:18,164 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.2de306c5-edc6-4563-8167-4971a9435506 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:59:18,166 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.2de306c5-edc6-4563-8167-4971a9435506 dst=null perm=null proto=rpc 2017-07-30 23:59:18,167 INFO 
FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.2de306c5-edc6-4563-8167-4971a9435506 dst=null perm=null proto=rpc 2017-07-30 23:59:18,167 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:59:28,168 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.58438bc8-0d59-49d0-8bf4-f48680890bbb dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:59:28,171 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.58438bc8-0d59-49d0-8bf4-f48680890bbb dst=null perm=null proto=rpc 2017-07-30 23:59:28,172 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.58438bc8-0d59-49d0-8bf4-f48680890bbb dst=null perm=null proto=rpc 2017-07-30 23:59:28,173 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:59:38,174 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b3cc0e76-9211-4dd4-b249-35ca3c690cdc dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:59:38,176 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b3cc0e76-9211-4dd4-b249-35ca3c690cdc dst=null perm=null proto=rpc 2017-07-30 23:59:38,176 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b3cc0e76-9211-4dd4-b249-35ca3c690cdc dst=null perm=null proto=rpc 2017-07-30 23:59:38,177 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:59:46,136 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-30 23:59:48,178 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.ea03eee1-e4ff-47c1-ac77-9dfe1440e7b6 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:59:48,180 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.ea03eee1-e4ff-47c1-ac77-9dfe1440e7b6 dst=null perm=null proto=rpc 2017-07-30 23:59:48,181 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.ea03eee1-e4ff-47c1-ac77-9dfe1440e7b6 dst=null perm=null proto=rpc 2017-07-30 23:59:48,181 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-30 23:59:58,183 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.3b272c21-71f1-4ec0-a29f-40ea45b31c48 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-30 23:59:58,184 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.3b272c21-71f1-4ec0-a29f-40ea45b31c48 dst=null perm=null proto=rpc 2017-07-30 23:59:58,185 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.3b272c21-71f1-4ec0-a29f-40ea45b31c48 dst=null perm=null proto=rpc 2017-07-30 23:59:58,185 INFO FSNamesystem.audit: 
allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-30 23:59:58,757 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5
2017-07-30 23:59:59,923 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc
==> /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-USGVLHDP01.Coveris.log <==
2017-08-06 21:31:30,087 INFO namenode.SecondaryNameNode (SecondaryNameNode.java:run(453)) - Image has not changed. Will not download image.
2017-08-06 21:31:30,091 INFO namenode.TransferFsImage (TransferFsImage.java:getFileClient(396)) - Opening connection to http://usgvlhdp01.coveris:50070/imagetransfer?getedit=1&startTxId=386017&endTxId=393086&storageInfo=-63:2013427911:0:CID-a04dcb87-5e2f-419e-b11e-db955dacfa53
2017-08-06 21:31:30,094 ERROR namenode.SecondaryNameNode (SecondaryNameNode.java:doWork(399)) - Exception in doCheckpoint
java.net.UnknownHostException: usgvlhdp01.coveris
  at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
  at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
  at java.net.Socket.connect(Socket.java:589)
  at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
  at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
  at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
  at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
  at sun.net.www.http.HttpClient.New(HttpClient.java:308)
  at sun.net.www.http.HttpClient.New(HttpClient.java:326)
  at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
  at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
  at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
  at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
  at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1513)
  at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
  at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
  at org.apache.hadoop.hdfs.server.namenode.TransferFsImage.doGetUrl(TransferFsImage.java:412)
  at org.apache.hadoop.hdfs.server.namenode.TransferFsImage.getFileClient(TransferFsImage.java:397)
  at org.apache.hadoop.hdfs.server.namenode.TransferFsImage.downloadEditsToStorage(TransferFsImage.java:167)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$2.run(SecondaryNameNode.java:465)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$2.run(SecondaryNameNode.java:444)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.downloadCheckpointFiles(SecondaryNameNode.java:443)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doCheckpoint(SecondaryNameNode.java:540)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.doWork(SecondaryNameNode.java:395)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$1.run(SecondaryNameNode.java:361)
  at org.apache.hadoop.security.SecurityUtil.doAsLoginUserOrFatal(SecurityUtil.java:449)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.run(SecondaryNameNode.java:357)
  at java.lang.Thread.run(Thread.java:745)
2017-08-06 21:32:09,927 ERROR namenode.SecondaryNameNode (LogAdapter.java:error(69)) - RECEIVED SIGNAL 15: SIGTERM
2017-08-06 21:32:10,280 INFO namenode.SecondaryNameNode (LogAdapter.java:info(45)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down SecondaryNameNode at USGVLHDP01.Coveris/10.6.240.213
************************************************************/
==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-26 <==
2017-07-26 23:58:31,890 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.3a7e744f-2f85-4216-b5dc-f37b87af78c9 dst=null perm=null proto=rpc
2017-07-26 23:58:31,890 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-26 23:58:41,893 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.cf048b39-1df4-4fc0-8fa1-be2706ff0473 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-26 23:58:41,895 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.cf048b39-1df4-4fc0-8fa1-be2706ff0473 dst=null perm=null proto=rpc
2017-07-26 23:58:41,899 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.cf048b39-1df4-4fc0-8fa1-be2706ff0473 dst=null perm=null proto=rpc
2017-07-26 23:58:41,900 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-26 23:58:46,170 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc
2017-07-26 23:58:51,901 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.44fdc2f1-6498-4fc8-aff4-f04bd1faaed7 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-26 23:58:51,903 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.44fdc2f1-6498-4fc8-aff4-f04bd1faaed7 dst=null perm=null proto=rpc
2017-07-26 23:58:51,904 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.44fdc2f1-6498-4fc8-aff4-f04bd1faaed7 dst=null perm=null proto=rpc
2017-07-26 23:58:51,904 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-26 23:58:52,212 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc
2017-07-26 23:58:58,754 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5
2017-07-26 23:59:01,906 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.50f1e9c3-1d20-4c41-9235-43872c6fa253 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-26 23:59:01,908 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.50f1e9c3-1d20-4c41-9235-43872c6fa253 dst=null perm=null proto=rpc
2017-07-26 23:59:01,909 INFO
FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.50f1e9c3-1d20-4c41-9235-43872c6fa253 dst=null perm=null proto=rpc 2017-07-26 23:59:01,909 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-26 23:59:11,911 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.fde52572-70c6-46ea-b9e1-020fb8b8a271 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-26 23:59:11,914 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.fde52572-70c6-46ea-b9e1-020fb8b8a271 dst=null perm=null proto=rpc 2017-07-26 23:59:11,915 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.fde52572-70c6-46ea-b9e1-020fb8b8a271 dst=null perm=null proto=rpc 2017-07-26 23:59:11,915 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-26 23:59:21,917 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.41349a0c-17a8-49dd-9f82-8bd7e1010f60 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-26 23:59:21,919 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.41349a0c-17a8-49dd-9f82-8bd7e1010f60 dst=null perm=null proto=rpc 2017-07-26 23:59:21,920 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.41349a0c-17a8-49dd-9f82-8bd7e1010f60 dst=null perm=null proto=rpc 2017-07-26 23:59:21,921 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-26 23:59:31,922 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.e71cc16f-fbfd-41d1-a399-efb0ccb158b5 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-26 23:59:31,924 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.e71cc16f-fbfd-41d1-a399-efb0ccb158b5 dst=null perm=null proto=rpc 2017-07-26 23:59:31,925 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.e71cc16f-fbfd-41d1-a399-efb0ccb158b5 dst=null perm=null proto=rpc 2017-07-26 23:59:31,925 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-26 23:59:41,927 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.a8d5c7f2-b4e6-4c08-8c28-8b536145da9a dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-26 23:59:41,929 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.a8d5c7f2-b4e6-4c08-8c28-8b536145da9a dst=null perm=null proto=rpc 2017-07-26 23:59:41,929 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.a8d5c7f2-b4e6-4c08-8c28-8b536145da9a dst=null perm=null proto=rpc 2017-07-26 23:59:41,930 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-26 23:59:46,142 INFO FSNamesystem.audit: allowed=true 
ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-26 23:59:51,931 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.8c735d82-52bb-4467-a6cd-2a1bc644b07b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-26 23:59:51,934 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.8c735d82-52bb-4467-a6cd-2a1bc644b07b dst=null perm=null proto=rpc 2017-07-26 23:59:51,936 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.8c735d82-52bb-4467-a6cd-2a1bc644b07b dst=null perm=null proto=rpc 2017-07-26 23:59:51,936 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-26 23:59:52,226 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-26 23:59:58,759 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/gc.log-201708070025 <== Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8) Memory: 4k page, physical 2053820k(567768k free), swap 8241148k(7808768k free) CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC 2017-08-07T00:25:53.395-0400: 1.102: [GC (Allocation Failure) 2017-08-07T00:25:53.395-0400: 1.102: [ParNew: 104960K->11987K(118016K), 0.0190106 secs] 104960K->11987K(1035520K), 0.0191019 secs] [Times: user=0.05 sys=0.01, real=0.02 secs] Heap par new generation total 118016K, used 20244K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000) eden space 104960K, 7% used [0x00000000c0000000, 0x00000000c0810330, 0x00000000c6680000) from space 13056K, 91% used [0x00000000c7340000, 0x00000000c7ef4fd8, 0x00000000c8000000) to space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000) concurrent mark-sweep generation total 917504K, used 0K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000) Metaspace used 16152K, capacity 16382K, committed 16768K, reserved 1064960K class space used 1963K, capacity 2061K, committed 2176K, reserved 1048576K ==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out.5 <== ulimit -a for user hdfs core file size (blocks, -c) unlimited data seg size (kbytes, -d) unlimited scheduling priority (-e) 0 file size (blocks, -f) unlimited pending signals (-i) 7421 max locked memory (kbytes, -l) 64 max memory size (kbytes, -m) unlimited open files (-n) 128000 pipe size 
(512 bytes, -p) 8 POSIX message queues (bytes, -q) 819200 real-time priority (-r) 0 stack size (kbytes, -s) 10240 cpu time (seconds, -t) unlimited max user processes (-u) 65536 virtual memory (kbytes, -v) unlimited file locks (-x) unlimited ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-28 <== 2017-07-28 23:58:32,686 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.1f04069f-0b32-47d7-aab5-a60b10636130 dst=null perm=null proto=rpc 2017-07-28 23:58:32,687 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.1f04069f-0b32-47d7-aab5-a60b10636130 dst=null perm=null proto=rpc 2017-07-28 23:58:32,687 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:58:42,688 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.575b94b7-4944-4994-a8ee-a50266a6e705 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:58:42,690 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.575b94b7-4944-4994-a8ee-a50266a6e705 dst=null perm=null proto=rpc 2017-07-28 23:58:42,691 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.575b94b7-4944-4994-a8ee-a50266a6e705 dst=null perm=null proto=rpc 2017-07-28 23:58:42,691 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:58:46,147 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-28 23:58:52,692 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.2380a548-2030-4f0a-b714-ed4620389823 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:58:52,694 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.2380a548-2030-4f0a-b714-ed4620389823 dst=null perm=null proto=rpc 2017-07-28 23:58:52,695 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.2380a548-2030-4f0a-b714-ed4620389823 dst=null perm=null proto=rpc 2017-07-28 23:58:52,695 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:58:58,750 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-28 23:59:02,696 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.0bc12efc-1612-450e-91d3-dcad040e4c05 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:59:02,698 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.0bc12efc-1612-450e-91d3-dcad040e4c05 dst=null perm=null proto=rpc 2017-07-28 23:59:02,699 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.0bc12efc-1612-450e-91d3-dcad040e4c05 dst=null perm=null proto=rpc 2017-07-28 23:59:02,699 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus 
src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:59:12,700 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.d04b45d3-d7b6-4d7f-a13e-b6578df3fa86 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:59:12,702 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.d04b45d3-d7b6-4d7f-a13e-b6578df3fa86 dst=null perm=null proto=rpc 2017-07-28 23:59:12,703 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.d04b45d3-d7b6-4d7f-a13e-b6578df3fa86 dst=null perm=null proto=rpc 2017-07-28 23:59:12,703 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:59:22,705 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.c5f865a5-3d38-4855-8cf8-a089c7152264 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:59:22,706 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.c5f865a5-3d38-4855-8cf8-a089c7152264 dst=null perm=null proto=rpc 2017-07-28 23:59:22,707 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.c5f865a5-3d38-4855-8cf8-a089c7152264 dst=null perm=null proto=rpc 2017-07-28 23:59:22,707 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:59:23,305 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-28 23:59:32,709 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.4ed5b186-3ec3-41e7-a431-2d5ad132f057 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:59:32,711 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.4ed5b186-3ec3-41e7-a431-2d5ad132f057 dst=null perm=null proto=rpc 2017-07-28 23:59:32,711 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.4ed5b186-3ec3-41e7-a431-2d5ad132f057 dst=null perm=null proto=rpc 2017-07-28 23:59:32,712 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:59:42,713 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.c3b93ea6-3c8f-4f4e-8af8-6130e12a27f8 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:59:42,715 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.c3b93ea6-3c8f-4f4e-8af8-6130e12a27f8 dst=null perm=null proto=rpc 2017-07-28 23:59:42,716 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.c3b93ea6-3c8f-4f4e-8af8-6130e12a27f8 dst=null perm=null proto=rpc 2017-07-28 23:59:42,716 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:59:46,132 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 
2017-07-28 23:59:52,717 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.5bdd8af5-19b4-40d7-94ff-0c1d381de2b7 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-28 23:59:52,719 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.5bdd8af5-19b4-40d7-94ff-0c1d381de2b7 dst=null perm=null proto=rpc 2017-07-28 23:59:52,720 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.5bdd8af5-19b4-40d7-94ff-0c1d381de2b7 dst=null perm=null proto=rpc 2017-07-28 23:59:52,720 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-28 23:59:58,769 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out.1 <== ulimit -a for user hdfs core file size (blocks, -c) unlimited data seg size (kbytes, -d) unlimited scheduling priority (-e) 0 file size (blocks, -f) unlimited pending signals (-i) 7421 max locked memory (kbytes, -l) 64 max memory size (kbytes, -m) unlimited open files (-n) 128000 pipe size (512 bytes, -p) 8 POSIX message queues (bytes, -q) 819200 real-time priority (-r) 0 stack size (kbytes, -s) 10240 cpu time (seconds, -t) unlimited max user processes (-u) 65536 virtual memory (kbytes, -v) unlimited file locks (-x) unlimited ==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out.4 <== ulimit -a for user hdfs core file size (blocks, -c) unlimited data seg size (kbytes, -d) unlimited scheduling priority (-e) 0 file size (blocks, -f) unlimited pending signals (-i) 7421 max locked memory (kbytes, -l) 64 max memory size (kbytes, -m) unlimited open files (-n) 128000 pipe size (512 bytes, -p) 8 POSIX message queues (bytes, -q) 819200 real-time priority (-r) 0 stack size (kbytes, -s) 10240 cpu time (seconds, -t) unlimited max user processes (-u) 65536 virtual memory (kbytes, -v) unlimited file locks (-x) unlimited ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-08-02 <== 2017-08-02 23:58:32,886 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.c546a020-8343-492f-ab6d-012e08ee140a dst=null perm=null proto=rpc 2017-08-02 23:58:32,887 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:58:38,520 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-08-02 23:58:42,888 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.1ee9c3b0-0758-45af-bd79-c50d3c99db1d dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:58:42,890 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.1ee9c3b0-0758-45af-bd79-c50d3c99db1d dst=null perm=null proto=rpc 2017-08-02 23:58:42,890 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.1ee9c3b0-0758-45af-bd79-c50d3c99db1d dst=null perm=null proto=rpc 2017-08-02 23:58:42,891 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus 
src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:58:46,161 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-08-02 23:58:52,892 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.eddf6295-e3ec-4a41-9814-01764231fb3c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:58:52,894 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.eddf6295-e3ec-4a41-9814-01764231fb3c dst=null perm=null proto=rpc 2017-08-02 23:58:52,895 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.eddf6295-e3ec-4a41-9814-01764231fb3c dst=null perm=null proto=rpc 2017-08-02 23:58:52,895 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:58:59,264 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-08-02 23:59:02,897 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.002278ba-bcca-4075-b91b-816b64451e6d dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:59:02,898 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.002278ba-bcca-4075-b91b-816b64451e6d dst=null perm=null proto=rpc 2017-08-02 23:59:02,899 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.002278ba-bcca-4075-b91b-816b64451e6d dst=null perm=null proto=rpc 2017-08-02 23:59:02,899 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:59:12,901 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.18c59291-89ca-477d-8fe0-9f6580681254 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:59:12,903 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.18c59291-89ca-477d-8fe0-9f6580681254 dst=null perm=null proto=rpc 2017-08-02 23:59:12,903 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.18c59291-89ca-477d-8fe0-9f6580681254 dst=null perm=null proto=rpc 2017-08-02 23:59:12,904 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:59:22,905 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.e7a78bbb-1eb2-4e72-9cb6-877dbc1c764f dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:59:22,907 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.e7a78bbb-1eb2-4e72-9cb6-877dbc1c764f dst=null perm=null proto=rpc 2017-08-02 23:59:22,907 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.e7a78bbb-1eb2-4e72-9cb6-877dbc1c764f dst=null perm=null proto=rpc 2017-08-02 23:59:22,908 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null 
perm=null proto=rpc 2017-08-02 23:59:32,909 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.0a9c8f64-6d23-45dc-b96c-c0a67defa53c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:59:32,911 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.0a9c8f64-6d23-45dc-b96c-c0a67defa53c dst=null perm=null proto=rpc 2017-08-02 23:59:32,912 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.0a9c8f64-6d23-45dc-b96c-c0a67defa53c dst=null perm=null proto=rpc 2017-08-02 23:59:32,912 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:59:38,540 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-08-02 23:59:42,913 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.1c3bd81f-deb9-4201-9a52-d644292608bd dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:59:42,915 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.1c3bd81f-deb9-4201-9a52-d644292608bd dst=null perm=null proto=rpc 2017-08-02 23:59:42,916 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.1c3bd81f-deb9-4201-9a52-d644292608bd dst=null perm=null proto=rpc 2017-08-02 23:59:42,916 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:59:46,135 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-08-02 23:59:52,920 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.5f520bfc-aec8-4c62-a030-e8bdcb855a36 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-02 23:59:52,922 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.5f520bfc-aec8-4c62-a030-e8bdcb855a36 dst=null perm=null proto=rpc 2017-08-02 23:59:52,923 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.5f520bfc-aec8-4c62-a030-e8bdcb855a36 dst=null perm=null proto=rpc 2017-08-02 23:59:52,924 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-02 23:59:59,280 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/gc.log-201708071254 <== Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8) Memory: 4k page, physical 2053820k(389920k free), swap 8241148k(7925680k free) CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 
-XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC 2017-08-07T12:54:39.438-0400: 1.012: [GC (Allocation Failure) 2017-08-07T12:54:39.438-0400: 1.012: [ParNew: 104960K->11608K(118016K), 0.0191172 secs] 104960K->11608K(1035520K), 0.0192128 secs] [Times: user=0.03 sys=0.00, real=0.02 secs] Heap par new generation total 118016K, used 21737K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000) eden space 104960K, 9% used [0x00000000c0000000, 0x00000000c09e4680, 0x00000000c6680000) from space 13056K, 88% us2017-08-07T12:54:35.882-0400: 3.278: [CMS-concurrent-abortable-preclean-start] CMS: abort preclean due to time 2017-08-07T12:54:40.926-0400: 8.323: [CMS-concurrent-abortable-preclean: 1.341/5.044 secs] [Times: user=1.34 sys=0.00, real=5.04 secs] 2017-08-07T12:54:40.926-0400: 8.323: [GC (CMS Final Remark) [YG occupancy: 93530 K (184320 K)]2017-08-07T12:54:40.926-0400: 8.323: [Rescan (parallel) , 0.0052262 secs]2017-08-07T12:54:40.932-0400: 8.328: [weak refs processing, 0.0000193 secs]2017-08-07T12:54:40.932-0400: 8.328: [class unloading, 0.0027556 secs]2017-08-07T12:54:40.934-0400: 8.331: [scrub symbol table, 0.0020956 secs]2017-08-07T12:54:40.937-0400: 8.333: [scrub string table, 0.0004025 secs][1 CMS-remark: 0K(843776K)] 93530K(1028096K), 0.0111028 secs] [Times: user=0.02 sys=0.00, real=0.02 secs] 2017-08-07T12:54:40.938-0400: 8.334: [CMS-concurrent-sweep-start] 2017-08-07T12:54:40.938-0400: 8.334: [CMS-concurrent-sweep: 0.000/0.000 secs] [Times: user=0.00 sys=0.00, real=0.00 secs] 2017-08-07T12:54:40.938-0400: 8.334: [CMS-concurrent-reset-start] 2017-08-07T12:54:40.940-0400: 8.336: [CMS-concurrent-reset: 0.002/0.002 secs] [Times: user=0.00 sys=0.00, real=0.00 secs] 2017-08-07T19:55:11.480-0400: 25238.876: [GC (Allocation Failure) 2017-08-07T19:55:11.480-0400: 25238.876: [ParNew: 177443K->9810K(184320K), 0.0265809 secs] 177443K->14652K(1028096K), 0.0267285 secs] [Times: user=0.09 sys=0.00, real=0.03 secs] 2017-08-08T09:49:10.004-0400: 75277.401: [GC (Allocation Failure) 2017-08-08T09:49:10.005-0400: 75277.401: [ParNew: 173650K->4618K(184320K), 0.0116735 secs] 178492K->9460K(1028096K), 0.0119051 secs] [Times: user=0.05 sys=0.00, real=0.01 secs] 2017-08-09T00:24:28.154-0400: 127795.550: [GC (Allocation Failure) 2017-08-09T00:24:28.154-0400: 127795.550: [ParNew: 168458K->4189K(184320K), 0.0080917 secs] 173300K->9031K(1028096K), 0.0081868 secs] [Times: user=0.03 sys=0.00, real=0.01 secs] ==> /var/log/hadoop/hdfs/gc.log-201708062134 <== Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8) Memory: 4k page, physical 2053820k(749672k free), swap 8241148k(7700620k free) CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" 
-XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC 2017-08-06T21:34:30.041-0400: 1.395: [GC (Allocation Failure) 2017-08-06T21:34:30.041-0400: 1.395: [ParNew: 104960K->11988K(118016K), 0.0472503 secs] 104960K->11988K(1035520K), 0.0473388 secs] [Times: user=0.11 sys=0.01, real=0.05 secs] Heap par new generation total 118016K, used 20174K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000) eden space 104960K, 7% used [0x00000000c0000000, 0x00000000c07fe6e8, 0x00000000c6680000) from space 13056K, 91% used [0x00000000c7340000, 0x00000000c7ef5128, 0x00000000c8000000) to space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000) concurrent mark-sweep generation total 917504K, used 0K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000) Metaspace used 16151K, capacity 16370K, committed 16768K, reserved 1064960K class space used 1963K, capacity 2061K, committed 2176K, reserved 1048576K 64-0400: 8.711: [weak refs processing, 0.0000194 secs]2017-08-06T21:34:29.664-0400: 8.711: [class unloading, 0.0030016 secs]2017-08-06T21:34:29.667-0400: 8.714: [scrub symbol table, 0.0021789 secs]2017-08-06T21:34:29.669-0400: 8.716: [scrub string table, 0.0004160 secs][1 CMS-remark: 0K(843776K)] 93528K(1028096K), 0.0182304 secs] [Times: user=0.03 sys=0.00, real=0.02 secs] 2017-08-06T21:34:29.670-0400: 8.718: [CMS-concurrent-sweep-start] 2017-08-06T21:34:29.670-0400: 8.718: [CMS-concurrent-sweep: 0.000/0.000 secs] [Times: user=0.00 sys=0.00, real=0.00 secs] 2017-08-06T21:34:29.670-0400: 8.718: [CMS-concurrent-reset-start] 2017-08-06T21:34:29.672-0400: 8.720: [CMS-concurrent-reset: 0.002/0.002 secs] [Times: user=0.00 sys=0.00, real=0.00 secs] 2017-08-07T04:36:50.596-0400: 25349.643: [GC (Allocation Failure) 2017-08-07T04:36:50.596-0400: 25349.644: [ParNew: 177441K->10832K(184320K), 0.0300448 secs] 177441K->15673K(1028096K), 0.0302008 secs] [Times: user=0.10 sys=0.01, real=0.03 secs] Heap par new generation total 184320K, used 117778K [0x00000000c0000000, 0x00000000cc800000, 0x00000000cc800000) eden space 163840K, 65% used [0x00000000c0000000, 0x00000000c6870a60, 0x00000000ca000000) from space 20480K, 52% used [0x00000000ca000000, 0x00000000caa940b0, 0x00000000cb400000) to space 20480K, 0% used [0x00000000cb400000, 0x00000000cb400000, 0x00000000cc800000) concurrent mark-sweep generation total 843776K, used 4841K [0x00000000cc800000, 0x0000000100000000, 0x0000000100000000) Metaspace used 24212K, capacity 24490K, committed 24796K, reserved 1071104K class space used 2883K, capacity 2969K, committed 3044K, reserved 1048576K ==> /var/log/hadoop/hdfs/gc.log-201707242355 <== 2017-07-24T23:55:17.569-0400: 2.484: [GC (CMS Final Remark) [YG occupancy: 67671 K (118016 K)]2017-07-24T23:55:17.569-0400: 2.484: [Rescan (parallel) , 0.0099082 secs]2017-07-24T23:55:17.579-0400: 2.494: [weak refs processing, 0.0000505 secs]2017-07-24T23:55:17.579-0400: 2.494: [class unloading, 0.0036665 secs]2017-07-24T23:55:17.583-0400: 2.498: [scrub symbol table, 0.0023895 secs]2017-07-24T23:55:17.585-0400: 22017-07-24T23:56:13.198-0400: 67.664: [GC (Allocation Failure) 2017-07-24T23:56:13.198-0400: 67.664: [ParNew: 177445K->15770K(184320K), 0.0737544 secs] 
177445K->20414K(1028096K), 0.0738329 secs] [Times: user=0.12 sys=0.01, real=0.07 secs] 2017-07-25T02:34:25.563-0400: 9560.031: [GC (Allocation Failure) 2017-07-25T02:34:25.573-0400: 9560.039: [ParNew: 179610K->7444K(184320K), 0.7040179 secs] 184254K->16891K(1028096K), 0.7146757 secs] [Times: user=0.36 sys=0.07, real=0.71 secs] 2017-07-25T06:50:25.573-0400: 24920.041: [GC (Allocation Failure) 2017-07-25T06:50:25.577-0400: 24920.043: [ParNew: 171284K->2083K(184320K), 0.8418509 secs] 180731K->11530K(1028096K), 0.8506488 secs] [Times: user=0.61 sys=0.07, real=0.86 secs] Heap par new generation total 184320K, used 85648K [0x00000000c0000000, 0x00000000cc800000, 0x00000000cc800000) eden space 163840K, 51% used [0x00000000c0000000, 0x00000000c519b2f0, 0x00000000ca000000) from space 20480K, 10% used [0x00000000ca000000, 0x00000000ca208e28, 0x00000000cb400000) to space 20480K, 0% used [0x00000000cb400000, 0x00000000cb400000, 0x00000000cc800000) concurrent mark-sweep generation total 843776K, used 9447K [0x00000000cc800000, 0x0000000100000000, 0x0000000100000000) Metaspace used 33327K, capacity 33588K, committed 34012K, reserved 1079296K class space used 3701K, capacity 3789K, committed 3812K, reserved 1048576K .00, real=0.07 secs] 2017-07-24T23:57:39.687-0400: 144.602: [CMS-concurrent-mark-start] 2017-07-24T23:57:39.727-0400: 144.642: [CMS-concurrent-mark: 0.040/0.040 secs] [Times: user=0.02 sys=0.01, real=0.04 secs] 2017-07-24T23:57:39.727-0400: 144.642: [CMS-concurrent-preclean-start] 2017-07-24T23:57:39.743-0400: 144.658: [CMS-concurrent-preclean: 0.016/0.016 secs] [Times: user=0.00 sys=0.00, real=0.02 secs] 2017-07-24T23:57:39.743-0400: 144.658: [CMS-concurrent-abortable-preclean-start] CMS: abort preclean due to time 2017-07-24T23:57:44.755-0400: 149.670: [CMS-concurrent-abortable-preclean: 0.806/5.012 secs] [Times: user=0.85 sys=0.01, real=5.01 secs] 2017-07-24T23:57:44.757-0400: 149.672: [GC (CMS Final Remark) [YG occupancy: 55300 K (118016 K)]2017-07-24T23:57:44.757-0400: 149.672: [Rescan (parallel) , 0.0109141 secs]2017-07-24T23:57:44.768-0400: 149.683: [weak refs processing, 0.0009559 secs]2017-07-24T23:57:44.769-0400: 149.685: [class unloading, 0.1551420 secs]2017-07-24T23:57:44.924-0400: 149.840: [scrub symbol table, 0.0545478 secs]2017-07-24T23:57:44.979-0400: 149.894: [scrub string table, 0.0010482 secs][1 CMS-remark: 25677K(917504K)] 80978K(1035520K), 0.2235525 secs] [Times: user=0.05 sys=0.01, real=0.23 secs] 2017-07-24T23:57:44.981-0400: 149.896: [CMS-concurrent-sweep-start] 2017-07-24T23:57:44.988-0400: 149.903: [CMS-concurrent-sweep: 0.007/0.007 secs] [Times: user=0.01 sys=0.00, real=0.00 secs] 2017-07-24T23:57:44.988-0400: 149.903: [CMS-concurrent-reset-start] 2017-07-24T23:57:45.012-0400: 149.928: [CMS-concurrent-reset: 0.025/0.025 secs] [Times: user=0.00 sys=0.01, real=0.03 secs] 2017-07-25T00:02:49.857-0400: 454.774: [GC (Allocation Failure) 2017-07-25T00:02:49.860-0400: 454.775: [ParNew: 112441K->8371K(118016K), 0.7153321 secs] 137719K->33649K(1035520K), 0.7179079 secs] [Times: user=1.75 sys=0.06, real=0.71 secs] 2017-07-25T00:10:14.163-0400: 899.079: [GC (Allocation Failure) 2017-07-25T00:10:14.166-0400: 899.082: [ParNew: 113331K->6014K(118016K), 0.4410796 secs] 138609K->32996K(1035520K), 0.4443170 secs] [Times: user=0.96 sys=0.07, real=0.45 secs] 2017-07-25T00:20:48.158-0400: 1533.075: [GC (Allocation Failure) 2017-07-25T00:20:48.168-0400: 1533.084: [ParNew: 110974K->6704K(118016K), 0.4269153 secs] 137956K->33686K(1035520K), 0.4378318 secs] [Times: user=1.00 sys=0.04, 
real=0.44 secs] 2017-07-25T00:33:52.888-0400: 2317.807: [GC (Allocation Failure) 2017-07-25T00:33:52.892-0400: 2317.808: [ParNew: 111664K->4934K(118016K), 0.2555563 secs] 138646K->31917K(1035520K), 0.2594654 secs] [Times: user=0.34 sys=0.05, real=0.26 secs] 2017-07-25T00:54:32.986-0400: 3557.902: [GC (Allocation Failure) 2017-07-25T00:54:32.987-0400: 3557.903: [ParNew: 109894K->4952K(118016K), 0.3964411 secs] 136877K->31934K(1035520K), 0.3989628 secs] [Times: user=1.07 sys=0.05, real=0.40 secs] 2017-07-25T01:33:19.481-0400: 5884.398: [GC (Allocation Failure) 2017-07-25T01:33:19.484-0400: 5884.400: [ParNew: 109912K->5287K(118016K), 0.5308591 secs] 136894K->32270K(1035520K), 0.5355366 secs] [Times: user=1.58 sys=0.07, real=0.54 secs] 2017-07-25T02:11:33.010-0400: 8177.927: [GC (Allocation Failure) 2017-07-25T02:11:33.013-0400: 8177.929: [ParNew: 110247K->2801K(118016K), 0.4215638 secs] 137230K->32840K(1035520K), 0.4269521 secs] [Times: user=0.74 sys=0.04, real=0.43 secs] 2017-07-25T02:51:17.308-0400: 10562.225: [GC (Allocation Failure) 2017-07-25T02:51:17.311-0400: 10562.227: [ParNew: 107761K->1297K(118016K), 0.4089205 secs] 137800K->31584K(1035520K), 0.4128656 secs] [Times: user=0.91 sys=0.06, real=0.41 secs] 2017-07-25T03:30:44.682-0400: 12929.599: [GC (Allocation Failure) 2017-07-25T03:30:44.685-0400: 12929.601: [ParNew: 106257K->1013K(118016K), 0.7768678 secs] 136544K->31346K(1035520K), 0.7818162 secs] [Times: user=2.32 sys=0.05, real=0.79 secs] 2017-07-25T04:10:25.580-0400: 15310.497: [GC (Allocation Failure) 2017-07-25T04:10:25.582-0400: 15310.498: [ParNew: 105973K->1281K(118016K), 0.4218541 secs] 136306K->31687K(1035520K), 0.4257190 secs] [Times: user=0.58 sys=0.05, real=0.43 secs] 2017-07-25T04:50:21.203-0400: 17706.126: [GC (Allocation Failure) 2017-07-25T04:50:21.214-0400: 17706.129: [ParNew: 106241K->709K(118016K), 0.6367510 secs] 136647K->31178K(1035520K), 0.6490803 secs] [Times: user=1.81 sys=0.05, real=0.65 secs] 2017-07-25T05:29:28.828-0400: 20053.744: [GC (Allocation Failure) 2017-07-25T05:29:28.830-0400: 20053.746: [ParNew: 105669K->1392K(118016K), 0.5321111 secs] 136138K->31899K(1035520K), 0.5371877 secs] [Times: user=0.77 sys=0.07, real=0.53 secs] 2017-07-25T06:08:35.107-0400: 22400.023: [GC (Allocation Failure) 2017-07-25T06:08:35.110-0400: 22400.025: [ParNew: 106352K->2048K(118016K), 0.5634386 secs] 136859K->32654K(1035520K), 0.5689174 secs] [Times: user=1.34 sys=0.05, real=0.57 secs] 2017-07-25T06:48:25.642-0400: 24790.559: [GC (Allocation Failure) 2017-07-25T06:48:25.651-0400: 24790.567: [ParNew: 107008K->2018K(118016K), 0.4723772 secs] 137614K->32634K(1035520K), 0.4838151 secs] [Times: user=0.92 sys=0.06, real=0.48 secs] 2017-07-25T07:28:25.565-0400: 27190.483: [GC (Allocation Failure) 2017-07-25T07:28:25.570-0400: 27190.485: [ParNew: 106978K->1936K(118016K), 0.5817596 secs] 137594K->32565K(1035520K), 0.5880520 secs] [Times: user=1.16 sys=0.06, real=0.59 secs] 2017-07-25T08:07:48.587-0400: 29553.505: [GC (Allocation Failure) 2017-07-25T08:07:48.591-0400: 29553.507: [ParNew: 106896K->1963K(118016K), 0.6374134 secs] 137525K->32601K(1035520K), 0.6421560 secs] [Times: user=1.61 sys=0.07, real=0.64 secs] 2017-07-25T08:47:31.142-0400: 31936.058: [GC (Allocation Failure) 2017-07-25T08:47:31.144-0400: 31936.060: [ParNew: 106923K->1799K(118016K), 0.5264137 secs] 137561K->32450K(1035520K), 0.5317650 secs] [Times: user=1.21 sys=0.04, real=0.54 secs] ==> /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-USGVLHDP01.Coveris.local.out <== Exception in thread "main" 
java.lang.IllegalArgumentException: java.net.UnknownHostException: usgvlhdp01.coveris
  at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:411)
  at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:311)
  at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:288)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:232)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:192)
  at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:671)
Caused by: java.net.UnknownHostException: usgvlhdp01.coveris
  ... 6 more
ulimit -a for user hdfs
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7421
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
==> /var/log/hadoop/hdfs/SecurityAuth.audit <==
2017-07-28 12:09:59,131 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:10:58,869 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:10:58,974 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:11:58,854 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:11:58,910 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:12:58,891 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:12:58,981 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:32:02,172 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:32:03,627 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:32:58,947 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:32:59,035 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:33:58,896 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:33:59,017 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:34:58,906 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:34:59,021 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:35:58,881 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:35:58,994 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:36:58,890 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-28 12:36:59,007 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-29 08:13:59,117 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE)
2017-07-29
08:14:59,504 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:14:59,518 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:14:59,856 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:15:58,977 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:15:58,987 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:15:59,184 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:16:58,889 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:16:58,896 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:16:58,985 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:17:58,902 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:17:58,912 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:17:59,097 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:18:58,920 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:18:59,016 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:20:58,968 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:21:59,546 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:22:58,892 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:23:58,896 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:25:00,286 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) 2017-07-29 08:25:58,992 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for yarn (auth:SIMPLE) ==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out <== ulimit -a for user hdfs core file size (blocks, -c) unlimited data seg size (kbytes, -d) unlimited scheduling priority (-e) 0 file size (blocks, -f) unlimited pending signals (-i) 7421 max locked memory (kbytes, -l) 64 max memory size (kbytes, -m) unlimited open files (-n) 128000 pipe size (512 bytes, -p) 8 POSIX message queues (bytes, -q) 819200 real-time priority (-r) 0 stack size (kbytes, -s) 10240 cpu time (seconds, -t) unlimited max user processes (-u) 65536 virtual memory (kbytes, -v) unlimited file locks (-x) unlimited ==> /var/log/hadoop/hdfs/hadoop-hdfs-datanode-USGVLHDP01.Coveris.local.log <== 2017-08-09 12:42:07,025 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:12,025 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:17,025 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:22,026 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 
2017-08-09 12:42:27,026 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:32,026 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:37,027 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:42,027 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:47,027 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:52,028 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:42:57,028 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:02,028 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:07,029 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:12,029 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:17,029 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:22,029 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:27,030 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:32,030 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:37,030 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:42,030 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:47,031 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:52,031 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:43:57,031 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:02,032 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:07,032 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:12,032 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:17,033 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:22,033 WARN datanode.DataNode 
(BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:27,033 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:32,033 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:37,034 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:42,034 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:47,034 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:52,035 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:44:57,035 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:45:02,035 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:45:07,035 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:45:12,036 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:45:17,036 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 2017-08-09 12:45:22,036 WARN datanode.DataNode (BPServiceActor.java:retrieveNamespaceInfo(196)) - Problem connecting to server: usgvlhdp01.coveris:8020 ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-29 <== 2017-07-29 23:58:38,317 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.ce7b8eeb-df74-4060-8472-8bcb4401fad3 dst=null perm=null proto=rpc 2017-07-29 23:58:38,317 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:58:46,125 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-29 23:58:48,319 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.f186bdf5-4234-4167-a264-6cded43a1aa3 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:58:48,320 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.f186bdf5-4234-4167-a264-6cded43a1aa3 dst=null perm=null proto=rpc 2017-07-29 23:58:48,321 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.f186bdf5-4234-4167-a264-6cded43a1aa3 dst=null perm=null proto=rpc 2017-07-29 23:58:48,322 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:58:50,119 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-29 23:58:58,323 INFO FSNamesystem.audit: allowed=true ugi=spark 
(auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b533ff4a-dac8-47c6-9555-0415cfe4a170 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:58:58,325 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b533ff4a-dac8-47c6-9555-0415cfe4a170 dst=null perm=null proto=rpc 2017-07-29 23:58:58,326 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b533ff4a-dac8-47c6-9555-0415cfe4a170 dst=null perm=null proto=rpc 2017-07-29 23:58:58,326 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:58:58,753 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-29 23:59:08,327 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.19442083-a3d7-414a-9a48-abef11b6e822 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:59:08,330 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.19442083-a3d7-414a-9a48-abef11b6e822 dst=null perm=null proto=rpc 2017-07-29 23:59:08,330 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.19442083-a3d7-414a-9a48-abef11b6e822 dst=null perm=null proto=rpc 2017-07-29 23:59:08,331 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:59:18,332 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.076b4a25-853d-418d-8385-79e5583223a9 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:59:18,334 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.076b4a25-853d-418d-8385-79e5583223a9 dst=null perm=null proto=rpc 2017-07-29 23:59:18,334 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.076b4a25-853d-418d-8385-79e5583223a9 dst=null perm=null proto=rpc 2017-07-29 23:59:18,335 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:59:28,336 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.9a144359-7bc2-4c10-a5e4-3f07d9e154e1 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:59:28,338 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.9a144359-7bc2-4c10-a5e4-3f07d9e154e1 dst=null perm=null proto=rpc 2017-07-29 23:59:28,339 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.9a144359-7bc2-4c10-a5e4-3f07d9e154e1 dst=null perm=null proto=rpc 2017-07-29 23:59:28,339 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:59:38,341 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.8e966f50-e5ef-43ae-bcf1-0fd3da806c9b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:59:38,342 INFO FSNamesystem.audit: 
allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.8e966f50-e5ef-43ae-bcf1-0fd3da806c9b dst=null perm=null proto=rpc 2017-07-29 23:59:38,343 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.8e966f50-e5ef-43ae-bcf1-0fd3da806c9b dst=null perm=null proto=rpc 2017-07-29 23:59:38,345 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:59:46,107 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-29 23:59:48,346 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.7ba86d4f-da84-46fe-ab60-57623ee5306a dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:59:48,348 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.7ba86d4f-da84-46fe-ab60-57623ee5306a dst=null perm=null proto=rpc 2017-07-29 23:59:48,349 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.7ba86d4f-da84-46fe-ab60-57623ee5306a dst=null perm=null proto=rpc 2017-07-29 23:59:48,349 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:59:50,122 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-29 23:59:58,351 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.817144fb-2af2-4326-ae00-0b00697abd1c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-29 23:59:58,352 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.817144fb-2af2-4326-ae00-0b00697abd1c dst=null perm=null proto=rpc 2017-07-29 23:59:58,353 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.817144fb-2af2-4326-ae00-0b00697abd1c dst=null perm=null proto=rpc 2017-07-29 23:59:58,353 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-29 23:59:58,762 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-08-01 <== 2017-08-01 23:58:35,788 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.931d8b5e-b245-45db-bea7-fbef068c9c98 dst=null perm=null proto=rpc 2017-08-01 23:58:35,789 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.931d8b5e-b245-45db-bea7-fbef068c9c98 dst=null perm=null proto=rpc 2017-08-01 23:58:35,790 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:58:45,794 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.11163873-2569-47d0-bfc4-55623d334b05 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:58:45,796 INFO FSNamesystem.audit: allowed=true ugi=spark 
(auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.11163873-2569-47d0-bfc4-55623d334b05 dst=null perm=null proto=rpc 2017-08-01 23:58:45,797 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.11163873-2569-47d0-bfc4-55623d334b05 dst=null perm=null proto=rpc 2017-08-01 23:58:45,798 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:58:46,134 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-08-01 23:58:55,799 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.46d51cea-4d01-424a-aae9-c8f6a79810c2 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:58:55,801 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.46d51cea-4d01-424a-aae9-c8f6a79810c2 dst=null perm=null proto=rpc 2017-08-01 23:58:55,802 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.46d51cea-4d01-424a-aae9-c8f6a79810c2 dst=null perm=null proto=rpc 2017-08-01 23:58:55,802 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:58:58,752 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-08-01 23:59:05,804 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.a4d7bbde-34a9-4e23-8cc2-dd0acb6fd160 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:59:05,806 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.a4d7bbde-34a9-4e23-8cc2-dd0acb6fd160 dst=null perm=null proto=rpc 2017-08-01 23:59:05,807 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.a4d7bbde-34a9-4e23-8cc2-dd0acb6fd160 dst=null perm=null proto=rpc 2017-08-01 23:59:05,808 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:59:15,809 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.2bf27a4a-bf59-4033-8ce0-5224468baa54 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:59:15,811 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.2bf27a4a-bf59-4033-8ce0-5224468baa54 dst=null perm=null proto=rpc 2017-08-01 23:59:15,811 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.2bf27a4a-bf59-4033-8ce0-5224468baa54 dst=null perm=null proto=rpc 2017-08-01 23:59:15,812 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:59:23,791 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-08-01 23:59:25,813 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create 
src=/spark-history/.2eafea07-be8a-4f7c-acf8-f91e85c5ee24 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:59:25,814 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.2eafea07-be8a-4f7c-acf8-f91e85c5ee24 dst=null perm=null proto=rpc 2017-08-01 23:59:25,815 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.2eafea07-be8a-4f7c-acf8-f91e85c5ee24 dst=null perm=null proto=rpc 2017-08-01 23:59:25,816 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:59:35,817 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.4a309f87-def1-41f2-af60-ca0feb82656c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:59:35,820 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.4a309f87-def1-41f2-af60-ca0feb82656c dst=null perm=null proto=rpc 2017-08-01 23:59:35,821 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.4a309f87-def1-41f2-af60-ca0feb82656c dst=null perm=null proto=rpc 2017-08-01 23:59:35,821 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:59:45,822 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.e2052c38-a156-4df4-a1b5-04b0ee8a9cd5 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:59:45,824 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.e2052c38-a156-4df4-a1b5-04b0ee8a9cd5 dst=null perm=null proto=rpc 2017-08-01 23:59:45,825 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.e2052c38-a156-4df4-a1b5-04b0ee8a9cd5 dst=null perm=null proto=rpc 2017-08-01 23:59:45,825 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:59:46,204 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-08-01 23:59:55,826 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.963bda65-646d-49b1-bf8d-bc26f19ce450 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-01 23:59:55,828 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.963bda65-646d-49b1-bf8d-bc26f19ce450 dst=null perm=null proto=rpc 2017-08-01 23:59:55,829 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.963bda65-646d-49b1-bf8d-bc26f19ce450 dst=null perm=null proto=rpc 2017-08-01 23:59:55,829 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-01 23:59:58,760 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/gc.log-201707242357 <== Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 
2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8) Memory: 4k page, physical 2053820k(69836k free), swap 8241148k(7816532k free) CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-secondarynamenode/bin/kill-secondary-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC 2017-07-24T23:57:18.175-0400: 3.092: [GC (Allocation Failure) 2017-07-24T23:57:18.177-0400: 3.094: [ParNew: 104960K->10884K(118016K), 0.0354096 secs] 104960K->10884K(1035520K), 0.0671340 secs] [Times: user=0.03 sys=0.01, real=0.07 secs] 2017-07-24T23:58:19.759-0400: 64.677: [GC (Allocation Failure) 2017-07-24T23:58:19.760-0400: 64.677: [ParNew: 115844K->13056K(118016K), 0.0881963 secs] 115844K->26298K(1035520K), 0.0885351 secs] [Times: user=0.18 sys=0.02, real=0.09 secs] 2017-07-24T23:58:19.855-0400: 64.772: [GC (CMS Initial Mark) [1 CMS-initial-mark: 13242K(917504K)] 26314K(1035520K), 0.0139270 secs] [Times: user=0.02 sys=0.02, real=0.02 secs] 2017-07-24T23:58:19.869-0400: 64.786: [CMS-concurrent-mark-start] 2017-07-24T23:58:19.879-0400: 64.797: [CMS-concurrent-mark: 0.008/0.010 secs] [Times: user=0.03 sys=0.00, real=0.01 secs] 2017-07-24T23:58:19.879-0400: 64.797: [CMS-concurrent-preclean-start] 2017-07-24T23:58:19.881-0400: 64.798: [CMS-concurrent-preclean: 0.002/0.002 secs] [Times: user=0.01 sys=0.00, real=0.00 secs] 2017-07-24T23:58:19.881-0400: 64.799: [GC (CMS Final Remark) [YG occupancy: 13555 K (118016 K)]2017-07-24T23:58:19.881-0400: 64.799: [Rescan (parallel) , 0.0096662 secs]2017-07-24T23:58:19.891-0400: 64.808: [weak refs processing, 0.0000329 secs]2017-07-24T23:58:19.891-0400: 64.808: [class unloading, 0.0037232 secs]2017-07-24T23:58:19.895-0400: 64.812: [scrub symbol table, 0.0019585 secs]2017-07-24T23:58:19.896-0400: 64.814: [scrub string table, 0.0005074 secs][1 CMS-remark: 13242K(917504K)] 26797K(1035520K), 0.0165212 secs] [Times: user=0.04 sys=0.00, real=0.02 secs] 2017-07-24T23:58:19.898-0400: 64.815: [CMS-concurrent-sweep-start] 2017-07-24T23:58:19.901-0400: 64.818: [CMS-concurrent-sweep: 0.003/0.003 secs] [Times: user=0.01 sys=0.00, real=0.00 secs] 2017-07-24T23:58:19.901-0400: 64.818: [CMS-concurrent-reset-start] 2017-07-24T23:58:19.915-0400: 64.833: [CMS-concurrent-reset: 0.015/0.015 secs] [Times: user=0.02 sys=0.01, real=0.01 secs] 2017-07-25T00:59:24.719-0400: 3729.637: [GC (Allocation Failure) 2017-07-25T00:59:24.721-0400: 3729.638: [ParNew: 118016K->3453K(118016K), 0.7490724 secs] 131250K->27873K(1035520K), 0.7522521 secs] [Times: user=1.36 sys=0.12, real=0.75 secs] 2017-07-25T02:53:28.024-0400: 10572.942: [GC (Allocation Failure) 2017-07-25T02:53:28.029-0400: 10572.947: [ParNew: 108413K->902K(118016K), 0.2749492 secs] 132833K->25321K(1035520K), 0.2809283 secs] [Times: user=0.13 sys=0.05, real=0.28 secs] 2017-07-25T05:11:31.586-0400: 18856.504: [GC (Allocation Failure) 2017-07-25T05:11:31.589-0400: 18856.507: [ParNew: 
105862K->1087K(118016K), 0.3391028 secs] 130281K->25507K(1035520K), 0.3449871 secs] [Times: user=0.33 sys=0.07, real=0.34 secs] 2017-07-25T06:34:38.126-0400: 23843.044: [GC (Allocation Failure) 2017-07-25T06:34:38.129-0400: 23843.047: [ParNew: 106047K->865K(118016K), 0.3804131 secs] 130467K->25285K(1035520K), 0.3850217 secs] [Times: user=0.30 sys=0.07, real=0.39 secs] 2017-07-25T08:51:42.901-0400: 32067.819: [GC (Allocation Failure) 2017-07-25T08:51:42.904-0400: 32067.822: [ParNew: 105825K->1009K(118016K), 0.4197626 secs] 130245K->25429K(1035520K), 0.4241379 secs] [Times: user=0.60 sys=0.06, real=0.42 secs] Heap par new generation total 118016K, used 15438K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000) eden space 104960K, 13% used [0x00000000c0000000, 0x00000000c0e17460, 0x00000000c6680000) from space 13056K, 7% used [0x00000000c7340000, 0x00000000c743c520, 0x00000000c8000000) to space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000) concurrent mark-sweep generation total 917504K, used 24419K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000) Metaspace used 25540K, capacity 25820K, committed 26236K, reserved 1073152K class space used 2819K, capacity 2906K, committed 2940K, reserved 1048576K ==> /var/log/hadoop/hdfs/gc.log-201708091245 <== Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8) Memory: 4k page, physical 2053820k(194628k free), swap 8241148k(7933844k free) CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC 2017-08-09T12:45:20.572-0400: 1.170: [GC (Allocation Failure) 2017-08-09T12:45:20.573-0400: 1.170: [ParNew: 104960K->11632K(118016K), 0.0254234 secs] 104960K->11632K(1035520K), 0.0255190 secs] [Times: user=0.06 sys=0.00, real=0.03 secs] Heap par new generation total 118016K, used 21762K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000) eden space 104960K, 9% used [0x00000000c0000000, 0x00000000c09e47f0, 0x00000000c6680000) from space 13056K, 89% used [0x00000000c7340000, 0x00000000c7e9c2d8, 0x00000000c8000000) to space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000) concurrent mark-sweep generation total 917504K, used 0K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000) Metaspace used 16148K, capacity 16382K, committed 16768K, reserved 1064960K class space used 1963K, capacity 2061K, committed 2176K, reserved 1048576K ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-08-04 <== 2017-08-04 23:58:34,566 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.f0fd4d19-f960-4056-8e64-9102719ef000 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:58:34,568 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 
cmd=getfileinfo src=/spark-history/.f0fd4d19-f960-4056-8e64-9102719ef000 dst=null perm=null proto=rpc 2017-08-04 23:58:34,569 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.f0fd4d19-f960-4056-8e64-9102719ef000 dst=null perm=null proto=rpc 2017-08-04 23:58:34,569 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:58:44,570 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.d07c3db9-32e6-4f6f-b00c-5a496ed47f1b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:58:44,572 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.d07c3db9-32e6-4f6f-b00c-5a496ed47f1b dst=null perm=null proto=rpc 2017-08-04 23:58:44,573 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.d07c3db9-32e6-4f6f-b00c-5a496ed47f1b dst=null perm=null proto=rpc 2017-08-04 23:58:44,573 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:58:54,575 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.189f2a1c-8b3e-4cfc-ad3f-ca09c68f65c5 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:58:54,576 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.189f2a1c-8b3e-4cfc-ad3f-ca09c68f65c5 dst=null perm=null proto=rpc 2017-08-04 23:58:54,577 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.189f2a1c-8b3e-4cfc-ad3f-ca09c68f65c5 dst=null perm=null proto=rpc 2017-08-04 23:58:54,578 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:58:58,824 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-08-04 23:58:59,298 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-08-04 23:59:04,579 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.588bc06d-f4d1-4281-a2f6-0005cef29d0b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:59:04,581 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.588bc06d-f4d1-4281-a2f6-0005cef29d0b dst=null perm=null proto=rpc 2017-08-04 23:59:04,581 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.588bc06d-f4d1-4281-a2f6-0005cef29d0b dst=null perm=null proto=rpc 2017-08-04 23:59:04,582 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:59:14,583 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b6e9c00e-a6b3-4b4e-a3b3-c9a3025267f8 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:59:14,585 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo 
src=/spark-history/.b6e9c00e-a6b3-4b4e-a3b3-c9a3025267f8 dst=null perm=null proto=rpc 2017-08-04 23:59:14,586 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b6e9c00e-a6b3-4b4e-a3b3-c9a3025267f8 dst=null perm=null proto=rpc 2017-08-04 23:59:14,588 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:59:24,592 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b019b23a-0b09-4d0e-8db2-88ab51bf5b9e dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:59:24,593 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b019b23a-0b09-4d0e-8db2-88ab51bf5b9e dst=null perm=null proto=rpc 2017-08-04 23:59:24,595 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b019b23a-0b09-4d0e-8db2-88ab51bf5b9e dst=null perm=null proto=rpc 2017-08-04 23:59:24,595 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:59:34,597 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b69b3559-7b97-49c1-a282-3a7028734f68 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:59:34,598 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b69b3559-7b97-49c1-a282-3a7028734f68 dst=null perm=null proto=rpc 2017-08-04 23:59:34,599 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b69b3559-7b97-49c1-a282-3a7028734f68 dst=null perm=null proto=rpc 2017-08-04 23:59:34,599 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:59:44,601 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.d760be70-bf35-4d1c-b881-ebb966363cb0 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:59:44,602 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.d760be70-bf35-4d1c-b881-ebb966363cb0 dst=null perm=null proto=rpc 2017-08-04 23:59:44,603 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.d760be70-bf35-4d1c-b881-ebb966363cb0 dst=null perm=null proto=rpc 2017-08-04 23:59:44,603 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:59:54,605 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.4f49dced-3444-4790-8f9e-80a26a47c685 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-04 23:59:54,606 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.4f49dced-3444-4790-8f9e-80a26a47c685 dst=null perm=null proto=rpc 2017-08-04 23:59:54,608 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.4f49dced-3444-4790-8f9e-80a26a47c685 dst=null perm=null proto=rpc 2017-08-04 23:59:54,608 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 
cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-04 23:59:58,871 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-08-04 23:59:59,275 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/hadoop-hdfs-datanode-USGVLHDP01.Coveris.out <== ulimit -a for user hdfs core file size (blocks, -c) unlimited data seg size (kbytes, -d) unlimited scheduling priority (-e) 0 file size (blocks, -f) unlimited pending signals (-i) 7421 max locked memory (kbytes, -l) 64 max memory size (kbytes, -m) unlimited open files (-n) 128000 pipe size (512 bytes, -p) 8 POSIX message queues (bytes, -q) 819200 real-time priority (-r) 0 stack size (kbytes, -s) 10240 cpu time (seconds, -t) unlimited max user processes (-u) 65536 virtual memory (kbytes, -v) unlimited file locks (-x) unlimited ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-23 <== 2017-07-23 15:44:00,659 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.c89c52a1-ea46-4f4c-bb5f-74560af16016 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-23 15:44:00,661 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.c89c52a1-ea46-4f4c-bb5f-74560af16016 dst=null perm=null proto=rpc 2017-07-23 15:44:00,662 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.c89c52a1-ea46-4f4c-bb5f-74560af16016 dst=null perm=null proto=rpc 2017-07-23 15:44:00,662 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-23 15:44:08,334 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-23 15:44:10,663 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.8c637237-20e4-4f44-83d5-f512b5bb7bfc dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-23 15:44:10,668 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.8c637237-20e4-4f44-83d5-f512b5bb7bfc dst=null perm=null proto=rpc 2017-07-23 15:44:10,669 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.8c637237-20e4-4f44-83d5-f512b5bb7bfc dst=null perm=null proto=rpc 2017-07-23 15:44:10,669 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-23 15:44:20,698 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.83167631-150f-4da6-8fbb-977cec5a9748 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-23 15:44:20,730 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.83167631-150f-4da6-8fbb-977cec5a9748 dst=null perm=null proto=rpc 2017-07-23 15:44:20,732 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.83167631-150f-4da6-8fbb-977cec5a9748 dst=null perm=null proto=rpc 2017-07-23 15:44:20,732 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) 
ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-23 15:44:30,734 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.228657da-71c2-44ef-9981-c7e25b8e9e2e dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-23 15:44:30,735 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.228657da-71c2-44ef-9981-c7e25b8e9e2e dst=null perm=null proto=rpc 2017-07-23 15:44:30,736 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.228657da-71c2-44ef-9981-c7e25b8e9e2e dst=null perm=null proto=rpc 2017-07-23 15:44:30,737 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-23 15:44:31,503 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-23 15:44:32,437 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-23 15:44:40,741 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.5fd74bd5-3732-4aa0-91ee-8d732e291e07 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-23 15:44:40,744 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.5fd74bd5-3732-4aa0-91ee-8d732e291e07 dst=null perm=null proto=rpc 2017-07-23 15:44:40,748 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.5fd74bd5-3732-4aa0-91ee-8d732e291e07 dst=null perm=null proto=rpc 2017-07-23 15:44:40,748 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-23 15:44:43,286 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-23 15:45:31,505 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-23 15:45:32,437 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-23 15:45:43,216 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-23 15:46:31,508 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-23 15:46:32,438 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-23 15:46:43,247 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-23 15:47:08,334 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-23 15:47:31,510 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 
cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-23 15:47:32,437 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-23 15:47:43,311 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-23 15:48:31,512 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-23 15:48:32,437 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-23 15:48:43,378 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-23 15:49:31,514 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-23 15:49:32,437 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-23 15:49:43,287 INFO FSNamesystem.audit: all==> /var/log/hadoop/hdfs/gc.log-201708071249 <== Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8) Memory: 4k page, physical 2053820k(349284k free), swap 8241148k(7924036k free) CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC 2017-08-07T12:49:37.622-0400: 1.069: [GC (Allocation Failure) 2017-08-07T12:49:37.622-0400: 1.069: [ParNew: 104960K->11628K(118016K), 0.0108360 secs] 104960K->11628K(1035520K), 0.0109252 secs] [Times: user=0.02 sys=0.00, real=0.01 secs] Heap par new generation total 118016K, used 21758K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000) eden space 104960K, 9% used [0x00000000c0000000, 0x00000000c09e46c0, 0x00000000c6680000) from space 13056K, 89% used [0x00000000c7340000, 0x00000000c7e9b148, 0x00000000c8000000) to space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000) concurrent mark-sweep generation total 917504K, used 0K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000) Metaspace used 16145K, capacity 16382K, committed 16768K, reserved 1064960K class space used 1963K, capacity 2061K, committed 2176K, reserved 1048576K ==> /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-USGVLHDP01.Coveris.local.log <== 2017-08-07 12:51:39,050 INFO namenode.SecondaryNameNode (LogAdapter.java:info(45)) - STARTUP_MSG: 
/************************************************************ STARTUP_MSG: Starting SecondaryNameNode STARTUP_MSG: host = USGVLHDP01.Coveris.local/10.6.240.213 STARTUP_MSG: args = [] STARTUP_MSG: version = 2.7.1.2.4.2.0-258 STARTUP_MSG: classpath = /usr/hdp/current/hadoop-client/conf:/usr/hdp/2.4.2.0-258/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-plugin-classloader-0.5.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ojdbc6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.
4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/spark-yarn-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-common-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.4.2.0-258/hadoop/.//hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/./:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.4.
2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/zookeeper-3.4.6.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/ha
doop-yarn/lib/fst-2.24.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/xz-1.0.jar:/usr
/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//zookeeper-3.4.6.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp
/2.4.2.0-258/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-httpc
lient-3.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//joda-time-2.9.3.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.4.2.0-258.jar::mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.4.2.0-258/tez/tez-examples-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-library-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-tests-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-fs-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-dag-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-common-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-acls-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-internals-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-history-parser-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-cache-plugin-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-mapreduce-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-api-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/tez/conf:mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.4.2.0-258/tez/tez-examples-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-library-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-tests-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-fs-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-0.7.0.2.4.2.0-258.jar:/usr/h
dp/2.4.2.0-258/tez/tez-dag-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-common-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-history-with-acls-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-runtime-internals-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-history-parser-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-yarn-timeline-cache-plugin-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-mapreduce-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/tez-api-0.7.0.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-io-2.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/guava-11.0.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-azure-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-annotations-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.4.2.0-258/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.4.2.0-258/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.4.2.0-258/tez/lib/jsr305-2.0.3.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/slf4j-api-1.7.5.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-aws-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/hadoop-yarn-server-web-proxy-2.7.1.2.4.2.0-258.jar:/usr/hdp/2.4.2.0-258/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.4.2.0-258/tez/conf STARTUP_MSG: build = git@github.com:hortonworks/hadoop.git -r 13debf893a605e8a88df18a7d8d214f571e05289; compiled by 'jenkins' on 2016-04-25T05:46Z STARTUP_MSG: java = 1.8.0_77 ************************************************************/ 2017-08-07 12:51:39,060 INFO namenode.SecondaryNameNode (LogAdapter.java:info(45)) - registered UNIX signal handlers for [TERM, HUP, INT] 2017-08-07 12:51:39,324 WARN hdfs.DFSUtil (DFSUtil.java:getAddressesForNameserviceId(689)) - Namenode for null remains unresolved for ID null. Check your hdfs-site.xml file to ensure namenodes are configured properly. 2017-08-07 12:51:39,491 INFO impl.MetricsConfig (MetricsConfig.java:loadFirst(112)) - loaded properties from hadoop-metrics2.properties 2017-08-07 12:51:39,546 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(375)) - Scheduled snapshot period at 10 second(s). 
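The WARN from hdfs.DFSUtil just above ("Namenode for null remains unresolved for ID null") and the UnresolvedAddressException in the NameNode log tail that follows both suggest the same thing: the NameNode host name configured in hdfs-site.xml (or derived from fs.defaultFS) does not resolve from this box, so the NameNode HTTP server cannot bind. A minimal check, not part of the original output — the conf dir is assumed to be the standard HDP client path, and USGVLHDP01.Coveris.local is the host named in the log file names:

# Which NameNode address(es) HDFS thinks it has, read from the deployed config.
hdfs getconf -namenodes
hdfs getconf -confKey dfs.namenode.http-address
hdfs getconf -confKey fs.defaultFS

# What is literally in the files Ambari pushed (paths assumed; adjust if your conf dir differs).
grep -A1 'dfs.namenode' /usr/hdp/current/hadoop-client/conf/hdfs-site.xml
grep -A1 'fs.defaultFS' /usr/hdp/current/hadoop-client/conf/core-site.xml

# Confirm that the configured host name actually resolves on this machine (DNS or /etc/hosts).
hostname -f
getent hosts USGVLHDP01.Coveris.local

If getent returns nothing for the configured host name, fixing DNS or adding an /etc/hosts entry and restarting the NameNode through Ambari is usually enough to get past this exact stack trace.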
2017-08-07 12:51:39,546 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:start(192)) - SecondaryNameNode metrics system started
2017-08-07 12:51:39,562 INFO namenode.SecondaryNameNode (LogAdapter.java:info(45)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down SecondaryNameNode at USGVLHDP01.Coveris.local/10.6.240.213
************************************************************/

==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out.2 <==
ulimit -a for user hdfs
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7421
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

==> /var/log/hadoop/hdfs/hadoop-hdfs-datanode-USGVLHDP01.Coveris.local.out <==
ulimit -a for user hdfs
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7421
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.log <==
        at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:892)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:720)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:951)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:935)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1641)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1707)
Caused by: java.nio.channels.UnresolvedAddressException
        at sun.nio.ch.Net.checkAddress(Net.java:101)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        ... 10 more
2017-08-09 12:45:20,630 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(211)) - Stopping NameNode metrics system...
2017-08-09 12:45:20,630 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(217)) - NameNode metrics system stopped.
2017-08-09 12:45:20,630 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(605)) - NameNode metrics system shutdown complete.
2017-08-09 12:45:20,630 ERROR namenode.NameNode (NameNode.java:main(1712)) - Failed to start namenode.
java.net.SocketException: Unresolved address
        at sun.nio.ch.Net.translateToSocketException(Net.java:131)
        at sun.nio.ch.Net.translateException(Net.java:157)
        at sun.nio.ch.Net.translateException(Net.java:163)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:76)
        at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
        at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:914)
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:856)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:156)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:892)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:720)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:951)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:935)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1641)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1707)
Caused by: java.nio.channels.UnresolvedAddressException
        at sun.nio.ch.Net.checkAddress(Net.java:101)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        ... 10 more
2017-08-09 12:45:20,632 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2017-08-09 12:45:20,633 INFO namenode.NameNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at USGVLHDP01.Coveris.local/10.6.240.213
************************************************************/

==> /var/log/hadoop/hdfs/hadoop-hdfs-secondarynamenode-USGVLHDP01.Coveris.out.1 <==
ulimit -a for user hdfs
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7421
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.out.1 <==
Jul 24, 2017 11:56:08 PM com.sun.jersey.api.core.PackagesResourceConfig init
INFO: Scanning for root resource and provider classes in the packages: org.apache.hadoop.hdfs.server.namenode.web.resources org.apache.hadoop.hdfs.web.resources
Jul 24, 2017 11:56:09 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Root resource classes found: class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
Jul 24, 2017 11:56:09 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Provider classes found: class org.apache.hadoop.hdfs.web.resources.ExceptionHandler class org.apache.hadoop.hdfs.web.resources.UserProvider
Jul 24, 2017 11:56:09 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Jul 24, 2017 11:56:09 PM com.sun.jersey.spi.inject.Errors processErrorMessages
WARNING: The following warnings have been detected with resource and/or provider classes:
WARNING: A sub-resource method, public javax.ws.rs.core.Response
org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method WARNING: A sub-resource method, public javax.ws.rs.core.Response 
org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-31 <== 2017-07-31 23:58:38,441 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.d3f4d292-9dd5-47f0-ad5d-3902362d4317 dst=null perm=null proto=rpc 2017-07-31 23:58:38,441 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.d3f4d292-9dd5-47f0-ad5d-3902362d4317 dst=null perm=null proto=rpc 2017-07-31 23:58:38,442 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:58:46,149 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-31 23:58:48,443 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.0fd26bd3-e8bf-46ed-9aec-7de770c7a9b6 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:58:48,445 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.0fd26bd3-e8bf-46ed-9aec-7de770c7a9b6 dst=null perm=null proto=rpc 2017-07-31 23:58:48,446 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.0fd26bd3-e8bf-46ed-9aec-7de770c7a9b6 dst=null perm=null proto=rpc 2017-07-31 23:58:48,446 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:58:58,448 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.ddd880e5-41a1-4212-8875-acd23db4001b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:58:58,450 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.ddd880e5-41a1-4212-8875-acd23db4001b dst=null perm=null proto=rpc 2017-07-31 23:58:58,451 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.ddd880e5-41a1-4212-8875-acd23db4001b dst=null perm=null proto=rpc 2017-07-31 23:58:58,453 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:58:58,753 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-07-31 23:59:08,455 INFO FSNamesystem.audit: 
allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b124eb72-7e12-4c75-b332-629776fcf2de dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:59:08,456 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b124eb72-7e12-4c75-b332-629776fcf2de dst=null perm=null proto=rpc 2017-07-31 23:59:08,457 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b124eb72-7e12-4c75-b332-629776fcf2de dst=null perm=null proto=rpc 2017-07-31 23:59:08,457 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:59:09,751 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-07-31 23:59:18,458 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.56ac2bec-8540-4f20-93ac-3dec37c9f800 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:59:18,460 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.56ac2bec-8540-4f20-93ac-3dec37c9f800 dst=null perm=null proto=rpc 2017-07-31 23:59:18,461 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.56ac2bec-8540-4f20-93ac-3dec37c9f800 dst=null perm=null proto=rpc 2017-07-31 23:59:18,461 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:59:28,463 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.843e9c54-d1c5-4a52-a781-d321e00ab6fb dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:59:28,464 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.843e9c54-d1c5-4a52-a781-d321e00ab6fb dst=null perm=null proto=rpc 2017-07-31 23:59:28,465 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.843e9c54-d1c5-4a52-a781-d321e00ab6fb dst=null perm=null proto=rpc 2017-07-31 23:59:28,465 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:59:38,469 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.03bce79c-6e03-49fa-8850-7570a9d7966e dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:59:38,472 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.03bce79c-6e03-49fa-8850-7570a9d7966e dst=null perm=null proto=rpc 2017-07-31 23:59:38,475 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.03bce79c-6e03-49fa-8850-7570a9d7966e dst=null perm=null proto=rpc 2017-07-31 23:59:38,477 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:59:46,121 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc 2017-07-31 23:59:48,483 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 
cmd=create src=/spark-history/.c029184e-51db-4d89-822b-3880cec429d5 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:59:48,487 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.c029184e-51db-4d89-822b-3880cec429d5 dst=null perm=null proto=rpc 2017-07-31 23:59:48,490 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.c029184e-51db-4d89-822b-3880cec429d5 dst=null perm=null proto=rpc 2017-07-31 23:59:48,491 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:59:58,496 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.67a0b16a-647c-4531-8a19-a7852a043ea6 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-31 23:59:58,501 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.67a0b16a-647c-4531-8a19-a7852a043ea6 dst=null perm=null proto=rpc 2017-07-31 23:59:58,504 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.67a0b16a-647c-4531-8a19-a7852a043ea6 dst=null perm=null proto=rpc 2017-07-31 23:59:58,506 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-31 23:59:58,764 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-08-05 <== 2017-08-05 23:58:29,231 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-08-05 23:58:38,985 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.dd625f3b-a706-4a30-b61a-8641786e3b96 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:58:38,987 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.dd625f3b-a706-4a30-b61a-8641786e3b96 dst=null perm=null proto=rpc 2017-08-05 23:58:38,988 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.dd625f3b-a706-4a30-b61a-8641786e3b96 dst=null perm=null proto=rpc 2017-08-05 23:58:38,988 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:58:48,990 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.d426cdd4-97e9-466d-ace5-324b8a0772a5 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:58:48,992 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.d426cdd4-97e9-466d-ace5-324b8a0772a5 dst=null perm=null proto=rpc 2017-08-05 23:58:48,993 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.d426cdd4-97e9-466d-ace5-324b8a0772a5 dst=null perm=null proto=rpc 2017-08-05 23:58:48,994 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:58:58,995 INFO FSNamesystem.audit: allowed=true ugi=spark 
(auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.bd22b677-e4ce-4523-a328-190a90d77a00 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:58:58,997 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.bd22b677-e4ce-4523-a328-190a90d77a00 dst=null perm=null proto=rpc 2017-08-05 23:58:58,998 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.bd22b677-e4ce-4523-a328-190a90d77a00 dst=null perm=null proto=rpc 2017-08-05 23:58:58,998 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:58:59,276 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 2017-08-05 23:59:09,000 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.3d9eec26-fbca-4d72-88ca-37f7b7d0cc3c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:59:09,001 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.3d9eec26-fbca-4d72-88ca-37f7b7d0cc3c dst=null perm=null proto=rpc 2017-08-05 23:59:09,002 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.3d9eec26-fbca-4d72-88ca-37f7b7d0cc3c dst=null perm=null proto=rpc 2017-08-05 23:59:09,002 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:59:19,003 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.f25e1413-0ffe-416d-a255-19721dd23610 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:59:19,005 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.f25e1413-0ffe-416d-a255-19721dd23610 dst=null perm=null proto=rpc 2017-08-05 23:59:19,006 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.f25e1413-0ffe-416d-a255-19721dd23610 dst=null perm=null proto=rpc 2017-08-05 23:59:19,006 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:59:29,008 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.610186f6-0686-4bd9-8824-3f8ab78226ae dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:59:29,009 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.610186f6-0686-4bd9-8824-3f8ab78226ae dst=null perm=null proto=rpc 2017-08-05 23:59:29,010 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.610186f6-0686-4bd9-8824-3f8ab78226ae dst=null perm=null proto=rpc 2017-08-05 23:59:29,010 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:59:29,261 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc 2017-08-05 23:59:39,012 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) 
ip=/10.6.240.212 cmd=create src=/spark-history/.8878cf3b-b063-4156-a052-c09f5f2610d5 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:59:39,013 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.8878cf3b-b063-4156-a052-c09f5f2610d5 dst=null perm=null proto=rpc 2017-08-05 23:59:39,014 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.8878cf3b-b063-4156-a052-c09f5f2610d5 dst=null perm=null proto=rpc 2017-08-05 23:59:39,014 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:59:49,017 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.86dcb6d0-4f0f-47df-b5eb-a786aeb36aa6 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:59:49,019 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.86dcb6d0-4f0f-47df-b5eb-a786aeb36aa6 dst=null perm=null proto=rpc 2017-08-05 23:59:49,020 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.86dcb6d0-4f0f-47df-b5eb-a786aeb36aa6 dst=null perm=null proto=rpc 2017-08-05 23:59:49,021 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:59:59,022 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.3fd55d2c-4171-4d77-ba61-b34d60b03c13 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-08-05 23:59:59,024 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.3fd55d2c-4171-4d77-ba61-b34d60b03c13 dst=null perm=null proto=rpc 2017-08-05 23:59:59,026 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.3fd55d2c-4171-4d77-ba61-b34d60b03c13 dst=null perm=null proto=rpc 2017-08-05 23:59:59,026 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-08-05 23:59:59,274 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5 ==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-24 <== 2017-07-24 23:59:31,528 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive dst=null perm=null proto=rpc 2017-07-24 23:59:31,532 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive dst=null perm=null proto=rpc 2017-07-24 23:59:31,534 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/anonymous dst=null perm=null proto=rpc 2017-07-24 23:59:31,542 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534 dst=null perm=null proto=rpc 2017-07-24 23:59:31,548 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=mkdirs src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534 dst=null 
perm=anonymous:hdfs:rwx------ proto=rpc 2017-07-24 23:59:31,551 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534 dst=null perm=null proto=rpc 2017-07-24 23:59:31,555 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534/_tmp_space.db dst=null perm=null proto=rpc 2017-07-24 23:59:31,566 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=mkdirs src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534/_tmp_space.db dst=null perm=anonymous:hdfs:rwx------ proto=rpc 2017-07-24 23:59:31,570 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534/_tmp_space.db dst=null perm=null proto=rpc 2017-07-24 23:59:31,928 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=delete src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534 dst=null perm=null proto=rpc callerContext=HIVE_SSN_ID:0aa1857a-85e8-45af-acf1-b859f493b534 2017-07-24 23:59:31,965 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534 dst=null perm=null proto=rpc 2017-07-24 23:59:31,967 INFO FSNamesystem.audit: allowed=true ugi=anonymous (auth:PROXY) via hive (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/anonymous/0aa1857a-85e8-45af-acf1-b859f493b534/_tmp_space.db dst=null perm=null proto=rpc 2017-07-24 23:59:34,569 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive dst=null perm=null proto=rpc 2017-07-24 23:59:34,614 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive dst=null perm=null proto=rpc 2017-07-24 23:59:34,616 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/ambari-qa dst=null perm=null proto=rpc 2017-07-24 23:59:34,671 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f dst=null perm=null proto=rpc 2017-07-24 23:59:34,681 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=mkdirs src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f dst=null perm=ambari-qa:hdfs:rwx------ proto=rpc 2017-07-24 23:59:34,684 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f dst=null perm=null proto=rpc 2017-07-24 23:59:34,690 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f/_tmp_space.db dst=null perm=null proto=rpc 2017-07-24 23:59:34,692 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=mkdirs src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f/_tmp_space.db dst=null perm=ambari-qa:hdfs:rwx------ proto=rpc 2017-07-24 23:59:34,694 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 
cmd=getfileinfo src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f/_tmp_space.db dst=null perm=null proto=rpc 2017-07-24 23:59:37,070 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/ats/active dst=null perm=null proto=rpc callerContext=HIVE_QUERY_ID:ambari-qa_20170724235934_6e051597-ad52-4112-adb5-cfcee4c07431 2017-07-24 23:59:37,409 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=delete src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f dst=null perm=null proto=rpc 2017-07-24 23:59:37,464 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f dst=null perm=null proto=rpc 2017-07-24 23:59:37,471 INFO FSNamesystem.audit: allowed=true ugi=ambari-qa (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/tmp/hive/ambari-qa/eb0db221-7531-4d4c-96c7-9f7ea614458f/_tmp_space.db dst=null perm=null proto=rpc 2017-07-24 23:59:38,664 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.ede4df22-29e9-4976-b45c-4d73bf9ed3b1 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-24 23:59:38,674 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.ede4df22-29e9-4976-b45c-4d73bf9ed3b1 dst=null perm=null proto=rpc 2017-07-24 23:59:38,677 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.ede4df22-29e9-4976-b45c-4d73bf9ed3b1 dst=null perm=null proto=rpc 2017-07-24 23:59:38,678 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-24 23:59:41,179 INFO FSNamesystem.audit: allowed=true ugi=hdfs (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/ats/done/ dst=null perm=null proto=webhdfs 2017-07-24 23:59:41,212 INFO FSNamesystem.audit: allowed=true ugi=hdfs (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/ats/active/ dst=null perm=null proto=webhdfs 2017-07-24 23:59:48,685 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.07c15a60-cd3f-4727-ac39-2d4a28653d07 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-24 23:59:48,693 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.07c15a60-cd3f-4727-ac39-2d4a28653d07 dst=null perm=null proto=rpc 2017-07-24 23:59:48,698 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.07c15a60-cd3f-4727-ac39-2d4a28653d07 dst=null perm=null proto=rpc 2017-07-24 23:59:48,700 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc 2017-07-24 23:59:48,909 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=getfileinfo src=/ats/active dst=null perm=null proto=rpc 2017-07-24 23:59:58,714 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.a0e92e59-9495-4845-8bdb-16b678460205 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc 2017-07-24 23:59:58,726 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.a0e92e59-9495-4845-8bdb-16b678460205 dst=null perm=null proto=rpc 2017-07-24 23:59:58,739 INFO 
FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.a0e92e59-9495-4845-8bdb-16b678460205 dst=null perm=null proto=rpc 2017-07-24 23:59:58,744 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc ==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.log <== 2017-08-06 21:33:04,509 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:04,511 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:04,512 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:07,512 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:07,513 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:07,513 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:10,513 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:10,514 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:10,514 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:11,448 INFO hdfs.StateChange (FSNamesystem.java:completeFile(3545)) - DIR* completeFile: /spark-history/.5347cc07-e087-4e66-a99f-9dd778c5cfbb is closed by DFSClient_NONMAPREDUCE_1961935643_1 2017-08-06 21:33:13,515 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:13,516 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:13,516 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:16,516 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:16,518 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 
2017-08-06 21:33:16,518 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:19,518 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:19,520 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:19,520 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:21,452 INFO hdfs.StateChange (FSNamesystem.java:completeFile(3545)) - DIR* completeFile: /spark-history/.751ca542-d201-4063-85df-ee072a0bf9de is closed by DFSClient_NONMAPREDUCE_1961935643_1 2017-08-06 21:33:22,520 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:22,521 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:22,521 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:25,521 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:25,523 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:25,523 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:28,523 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:28,524 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 2017-08-06 21:33:28,524 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas. 2017-08-06 21:33:31,525 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 4 blocks at priority level 2; Total=4 Reset bookmarks? false 2017-08-06 21:33:31,525 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0. 
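A side note on the repeated BlockStateChange / BlockManager messages above: every pass the NameNode selects 4 under-replicated blocks (693 needed in total) and then reports that all 4 "have no target", i.e. there is no additional live DataNode to place a replica on. That pattern typically means the replication factor is higher than the number of live DataNodes rather than a disk or corruption problem. Once the NameNode is back up, a quick way to confirm — a hedged sketch using the standard HDFS CLI, run as the hdfs service user:

# Live DataNode count and cluster-wide under-replication summary.
sudo -u hdfs hdfs dfsadmin -report | head -n 40

# Namespace health, including the under-replicated block count (can take a while on a big tree).
sudo -u hdfs hdfs fsck / | tail -n 30

# The default replication factor the cluster is configured with.
hdfs getconf -confKey dfs.replication

If dfs.replication is 3 and dfsadmin -report shows only one or two live DataNodes, the 693 "needed replications" will never drain; either add DataNodes or lower the replication factor on the affected paths with hdfs dfs -setrep.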
2017-08-06 21:33:31,525 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 4; of which 4 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas.
2017-08-06 21:33:33,844 ERROR namenode.NameNode (LogAdapter.java:error(71)) - RECEIVED SIGNAL 15: SIGTERM
2017-08-06 21:33:34,584 INFO BlockStateChange (UnderReplicatedBlocks.java:chooseUnderReplicatedBlocks(395)) - chooseUnderReplicatedBlocks selected 1 blocks at priority level 2; Total=1 Reset bookmarks? true
2017-08-06 21:33:34,587 INFO BlockStateChange (BlockManager.java:computeReplicationWorkForBlocks(1531)) - BLOCK* neededReplications = 693, pendingReplications = 0.
2017-08-06 21:33:34,587 INFO blockmanagement.BlockManager (BlockManager.java:computeReplicationWorkForBlocks(1538)) - Blocks chosen but could not be replicated = 1; of which 1 have no target, 0 have no source, 0 are UC, 0 are abandoned, 0 already have enough replicas.
2017-08-06 21:33:35,124 INFO namenode.NameNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at USGVLHDP01.Coveris/10.6.240.213
************************************************************/

==> /var/log/hadoop/hdfs/hadoop-hdfs-namenode-USGVLHDP01.Coveris.local.out.3 <==
ulimit -a for user hdfs
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7421
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

==> /var/log/hadoop/hdfs/gc.log-201707251248 <==
2017-08-05T17:18:30.507-0400: 966618.523: [GC (Allocation Failure) 2017-08-05T17:18:30.513-0400: 966618.528: [ParNew: 105080K->71K(118016K), 0.7127575 secs] 126709K->21700K(1035520K), 0.7292927 secs] [Times: user=0.99 sys=0.06, real=0.72 secs]
2017-08-05T18:10:39.789-0400: 969747.803: [GC (Allocation Failure) 2017-08-05T18:10:39.789-0400: 969747.804: [ParNew: 105031K->58K(118016K), 0.1364277 secs] 126660K->21693K(1035520K), 0.1369592 secs] [Times: user=0.14 sys=0.03, real=0.14 secs]
2017-08-05T19:15:50.508-0400: 973658.524: [GC (Allocation Failure) 2017-08-05T19:15:50.511-0400: 973658.526: [ParNew: 105018K->55K(118016K), 0.7133188 secs] 126653K->21690K(1035520K), 0.7224341 secs] [Times: user=1.60 sys=0.05, real=0.73 secs]
2017-08-05T20:08:02.415-0400: 976790.429: [GC (Allocation Failure) 2017-08-05T20:08:02.415-0400: 976790.430: [ParNew: 105015K->153K(118016K), 0.3074654 secs] 126650K->21788K(1035520K), 0.3094362 secs] [Times: user=0.67 sys=0.02, real=0.31 secs]
2017-08-05T21:12:14.083-0400: 980642.099: [GC (Allocation Failure) 2017-08-05T21:12:14.085-0400: 980642.100: [ParNew: 105113K->73K(118016K), 0.3857673 secs] 126748K->21709K(1035520K), 0.3884062 secs] [Times: user=0.37 sys=0.05, real=0.39 secs]
2017-08-05T22:04:23.566-0400: 983771.582: [GC (Allocation Failure) 2017-08-05T22:04:23.568-0400: 983771.583: [ParNew: 105033K->45K(118016K), 0.4969442 secs] 126669K->21680K(1035520K), 0.5010486 secs] [Times: user=0.60 sys=0.06, real=0.50 secs]
2017-08-05T22:55:33.869-0400: 986841.884: [GC (Allocation Failure) 2017-08-05T22:55:33.871-0400: 986841.885: [ParNew: 105005K->63K(118016K), 0.3540840 secs] 126640K->21698K(1035520K), 0.3558875 secs] [Times: user=0.93 sys=0.03, real=0.36 secs]
2017-08-05T23:47:43.142-0400: 989971.157: [GC (Allocation Failure) 2017-08-05T23:47:43.143-0400: 989971.158: [ParNew: 105023K->228K(118016K), 0.5216244 secs] 126658K->21864K(1035520K), 0.5253168 secs] [Times: user=0.44 sys=0.07, real=0.52 secs]
2017-08-06T00:39:57.809-0400: 993105.825: [GC (Allocation Failure) 2017-08-06T00:39:57.811-0400: 993105.826: [ParNew: 105188K->128K(118016K), 0.5049866 secs] 126824K->21764K(1035520K), 0.5160203 secs] [Times: user=1.32 sys=0.04, real=0.52 secs]
2017-08-06T01:31:09.336-0400: 996177.350: [GC (Allocation Failure) 2017-08-06T01:31:09.338-0400: 996177.352: [ParNew: 105088K->299K(118016K), 0.4648232 secs] 126724K->21934K(1035520K), 0.4678672 secs] [Times: user=0.60 sys=0.05, real=0.47 secs]
2017-08-06T02:34:19.900-0400: 999967.916: [GC (Allocation Failure) 2017-08-06T02:34:19.904-0400: 999967.918: [ParNew: 105259K->141K(118016K), 0.4383500 secs] 126894K->21776K(1035520K), 0.4443421 secs] [Times: user=1.11 sys=0.04, real=0.44 secs]
2017-08-06T03:25:29.660-0400: 1003037.675: [GC (Allocation Failure) 2017-08-06T03:25:29.661-0400: 1003037.676: [ParNew: 105101K->71K(118016K), 0.3629261 secs] 126736K->21706K(1035520K), 0.3642973 secs] [Times: user=0.33 sys=0.03, real=0.37 secs]
2017-08-06T04:16:37.281-0400: 1006105.297: [GC (Allocation Failure) 2017-08-06T04:16:37.283-0400: 1006105.298: [ParNew: 105031K->48K(118016K), 0.6406400 secs] 126666K->21683K(1035520K), 0.6439429 secs] [Times: user=1.33 sys=0.08, real=0.64 secs]
2017-08-06T05:07:48.457-0400: 1009176.472: [GC (Allocation Failure) 2017-08-06T05:07:48.458-0400: 1009176.472: [ParNew: 105008K->48K(118016K), 0.2303490 secs] 126643K->21683K(1035520K), 0.2323777 secs] [Times: user=0.41 sys=0.02, real=0.23 secs]
2017-08-06T05:57:55.654-0400: 1012183.669: [GC (Allocation Failure) 2017-08-06T05:57:55.654-0400: 1012183.669: [ParNew: 105008K->235K(118016K), 0.0199400 secs] 126643K->21871K(1035520K), 0.0204667 secs] [Times: user=0.02 sys=0.01, real=0.02 secs]
2017-08-06T06:49:05.117-0400: 1015253.137: [GC (Allocation Failure) 2017-08-06T06:49:05.123-0400: 1015253.138: [ParNew: 105195K->119K(118016K), 0.9774745 secs] 126831K->21754K(1035520K), 0.9937836 secs] [Times: user=2.11 sys=0.07, real=1.00 secs]
2017-08-06T07:39:14.251-0400: 1018262.266: [GC (Allocation Failure) 2017-08-06T07:39:14.251-0400: 1018262.266: [ParNew: 105079K->135K(118016K), 0.0207845 secs] 126714K->21770K(1035520K), 0.0213601 secs] [Times: user=0.05 sys=0.01, real=0.02 secs]
2017-08-06T08:30:26.800-0400: 1021334.816: [GC (Allocation Failure) 2017-08-06T08:30:26.802-0400: 1021334.816: [ParNew: 105095K->59K(118016K), 0.9293641 secs] 126730K->21695K(1035520K), 0.9408561 secs] [Times: user=2.01 sys=0.07, real=0.94 secs]
2017-08-06T09:20:33.854-0400: 1024341.869: [GC (Allocation Failure) 2017-08-06T09:20:33.855-0400: 1024341.870: [ParNew: 105019K->44K(118016K), 0.2959216 secs] 126655K->21679K(1035520K), 0.2974647 secs] [Times: user=0.59 sys=0.04, real=0.29 secs]
2017-08-06T10:10:37.633-0400: 1027345.648: [GC (Allocation Failure) 2017-08-06T10:10:37.634-0400: 1027345.649: [ParNew: 105004K->46K(118016K), 0.0149965 secs] 126639K->21682K(1035520K), 0.0155876 secs] [Times: user=0.03 sys=0.00, real=0.02 secs]
2017-08-06T10:47:43.095-0400: 1029571.110: [GC (Allocation Failure) 2017-08-06T10:47:43.096-0400: 1029571.110: [ParNew: 105006K->57K(118016K), 0.0712345 secs] 126642K->21692K(1035520K), 0.0718214 secs] [Times: user=0.06 sys=0.00, real=0.08 secs]
2017-08-06T11:37:57.429-0400: 1032585.444: [GC (Allocation Failure) 2017-08-06T11:37:57.430-0400: 1032585.445: [ParNew: 105017K->295K(118016K), 0.2710306 secs] 126652K->21931K(1035520K), 0.2716333 secs] [Times: user=0.21 sys=0.05, real=0.27 secs]
2017-08-06T12:28:04.772-0400: 1035592.786: [GC (Allocation Failure) 2017-08-06T12:28:04.777-0400: 1035592.791: [ParNew: 105255K->383K(118016K), 0.4854186 secs] 126891K->22019K(1035520K), 0.4940971 secs] [Times: user=1.12 sys=0.03, real=0.50 secs]
2017-08-06T13:27:15.034-0400: 1039143.049: [GC (Allocation Failure) 2017-08-06T13:27:15.035-0400: 1039143.049: [ParNew: 105343K->352K(118016K), 0.3020346 secs] 126979K->21988K(1035520K), 0.3026209 secs] [Times: user=0.33 sys=0.05, real=0.30 secs]
2017-08-06T14:26:24.543-0400: 1042692.557: [GC (Allocation Failure) 2017-08-06T14:26:24.543-0400: 1042692.558: [ParNew: 105312K->130K(118016K), 0.4631230 secs] 126948K->21765K(1035520K), 0.4644631 secs] [Times: user=0.85 sys=0.05, real=0.46 secs]
2017-08-06T15:14:35.205-0400: 1045583.220: [GC (Allocation Failure) 2017-08-06T15:14:35.206-0400: 1045583.221: [ParNew: 105090K->408K(118016K), 0.2881570 secs] 126725K->22043K(1035520K), 0.2896398 secs] [Times: user=0.46 sys=0.05, real=0.29 secs]
2017-08-06T16:12:43.928-0400: 1049071.942: [GC (Allocation Failure) 2017-08-06T16:12:43.928-0400: 1049071.943: [ParNew: 105368K->464K(118016K), 0.2305435 secs] 127003K->22100K(1035520K), 0.2328136 secs] [Times: user=0.72 sys=0.00, real=0.23 secs]
2017-08-06T17:10:52.397-0400: 1052560.413: [GC (Allocation Failure) 2017-08-06T17:10:52.399-0400: 1052560.414: [ParNew: 105424K->516K(118016K), 0.3561405 secs] 127060K->22151K(1035520K), 0.3581890 secs] [Times: user=0.66 sys=0.04, real=0.36 secs]
2017-08-06T18:09:00.306-0400: 1056048.322: [GC (Allocation Failure) 2017-08-06T18:09:00.308-0400: 1056048.323: [ParNew: 105476K->372K(118016K), 0.2722855 secs] 127111K->22008K(1035520K), 0.2745838 secs] [Times: user=0.08 sys=0.05, real=0.27 secs]
2017-08-06T19:06:09.788-0400: 1059477.802: [GC (Allocation Failure) 2017-08-06T19:06:09.788-0400: 1059477.803: [ParNew: 105332K->131K(118016K), 0.3400451 secs] 126968K->21766K(1035520K), 0.3406455 secs] [Times: user=0.55 sys=0.04, real=0.34 secs]
2017-08-06T20:04:18.281-0400: 1062966.296: [GC (Allocation Failure) 2017-08-06T20:04:18.282-0400: 1062966.297: [ParNew: 105091K->66K(118016K), 0.5572479 secs] 126726K->21701K(1035520K), 0.5603920 secs] [Times: user=1.06 sys=0.05, real=0.56 secs]
2017-08-06T20:52:24.675-0400: 1065852.689: [GC (Allocation Failure) 2017-08-06T20:52:24.675-0400: 1065852.690: [ParNew: 105026K->50K(118016K), 0.1193231 secs] 126661K->21685K(1035520K), 0.1199722 secs] [Times: user=0.02 sys=0.02, real=0.12 secs]
Heap
 par new generation total 118016K, used 92193K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000)
  eden space 104960K, 87% used [0x00000000c0000000, 0x00000000c59fbd88, 0x00000000c6680000)
  from space 13056K, 0% used [0x00000000c7340000, 0x00000000c734c830, 0x00000000c8000000)
  to space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000)
 concurrent mark-sweep generation total 917504K, used 21635K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000)
 Metaspace used 26313K, capacity 26628K, committed 27008K, reserved 1073152K
  class space used 2833K, capacity 2910K, committed 2944K, reserved 1048576K

==> /var/log/hadoop/hdfs/hadoop-hdfs-datanode-USGVLHDP01.Coveris.local.out.1 <==
ulimit -a for user hdfs
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 7421
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-08-03 <==
2017-08-03 23:58:27,743 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:58:37,745 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.821b875f-4dca-418e-944a-ecac25ae2c4c dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:58:37,746 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.821b875f-4dca-418e-944a-ecac25ae2c4c dst=null perm=null proto=rpc
2017-08-03 23:58:37,748 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.821b875f-4dca-418e-944a-ecac25ae2c4c dst=null perm=null proto=rpc
2017-08-03 23:58:37,748 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:58:47,750 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.b5a87dbd-2766-4a38-b81f-dc43bd59f18e dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:58:47,753 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.b5a87dbd-2766-4a38-b81f-dc43bd59f18e dst=null perm=null proto=rpc
2017-08-03 23:58:47,754 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.b5a87dbd-2766-4a38-b81f-dc43bd59f18e dst=null perm=null proto=rpc
2017-08-03 23:58:47,754 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:58:57,756 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.e6010a01-0f93-4000-a2a5-895b1ef0731a dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:58:57,761 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.e6010a01-0f93-4000-a2a5-895b1ef0731a dst=null perm=null proto=rpc
2017-08-03 23:58:57,763 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.e6010a01-0f93-4000-a2a5-895b1ef0731a dst=null perm=null proto=rpc
2017-08-03 23:58:57,763 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:58:59,271 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5
2017-08-03 23:59:07,766 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.6f419ce5-9ff7-4355-b8b4-98c2e6bb72e2 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:59:07,768 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.6f419ce5-9ff7-4355-b8b4-98c2e6bb72e2 dst=null perm=null proto=rpc
2017-08-03 23:59:07,769 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.6f419ce5-9ff7-4355-b8b4-98c2e6bb72e2 dst=null perm=null proto=rpc
2017-08-03 23:59:07,769 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:59:17,771 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.7593b9ea-4d99-44c6-8760-33ba819550e2 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:59:17,773 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.7593b9ea-4d99-44c6-8760-33ba819550e2 dst=null perm=null proto=rpc
2017-08-03 23:59:17,776 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.7593b9ea-4d99-44c6-8760-33ba819550e2 dst=null perm=null proto=rpc
2017-08-03 23:59:17,777 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:59:23,405 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc
2017-08-03 23:59:27,781 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.ad5274a8-b97e-4ffc-b366-e011703d7f61 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:59:27,786 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.ad5274a8-b97e-4ffc-b366-e011703d7f61 dst=null perm=null proto=rpc
2017-08-03 23:59:27,789 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.ad5274a8-b97e-4ffc-b366-e011703d7f61 dst=null perm=null proto=rpc
2017-08-03 23:59:27,791 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:59:37,793 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.7a61c7f9-5c96-4dcc-89a6-389abde9c40b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:59:37,799 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.7a61c7f9-5c96-4dcc-89a6-389abde9c40b dst=null perm=null proto=rpc
2017-08-03 23:59:37,801 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.7a61c7f9-5c96-4dcc-89a6-389abde9c40b dst=null perm=null proto=rpc
2017-08-03 23:59:37,803 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:59:47,805 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.dfdf4cac-1cb4-4c86-9aa4-98d09e4fcf2b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:59:47,814 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.dfdf4cac-1cb4-4c86-9aa4-98d09e4fcf2b dst=null perm=null proto=rpc
2017-08-03 23:59:47,815 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.dfdf4cac-1cb4-4c86-9aa4-98d09e4fcf2b dst=null perm=null proto=rpc
2017-08-03 23:59:47,815 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:59:57,817 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.d2fe0bc9-5b48-41fe-b3e7-914b9eaea2a7 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-08-03 23:59:57,820 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.d2fe0bc9-5b48-41fe-b3e7-914b9eaea2a7 dst=null perm=null proto=rpc
2017-08-03 23:59:57,821 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.d2fe0bc9-5b48-41fe-b3e7-914b9eaea2a7 dst=null perm=null proto=rpc
2017-08-03 23:59:57,822 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-08-03 23:59:59,264 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5

==> /var/log/hadoop/hdfs/gc.log-201708070032 <==
Java HotSpot(TM) 64-Bit Server VM (25.77-b03) for linux-amd64 JRE (1.8.0_77-b03), built on Mar 20 2016 22:00:46 by "java_re" with gcc 4.3.0 20080428 (Red Hat 4.3.0-8)
Memory: 4k page, physical 2053820k(557788k free), swap 8241148k(7817000k free)
CommandLine flags: -XX:CMSInitiatingOccupancyFraction=70 -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:InitialHeapSize=1073741824 -XX:MaxHeapSize=1073741824 -XX:MaxNewSize=134217728 -XX:MaxTenuringThreshold=6 -XX:NewSize=134217728 -XX:OldPLABSize=16 -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:OnOutOfMemoryError="/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node" -XX:ParallelGCThreads=8 -XX:+PrintGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC
2017-08-07T00:32:11.587-0400: 1.078: [GC (Allocation Failure) 2017-08-07T00:32:11.587-0400: 1.078: [ParNew: 104960K->11989K(118016K), 0.0385929 secs] 104960K->11989K(1035520K), 0.0386844 secs] [Times: user=0.10 sys=0.01, real=0.03 secs]
Heap
 par new generation total 118016K, used 20175K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000)
  eden space 104960K, 7% used [0x00000000c0000000, 0x00000000c07fe670, 0x00000000c6680000)
  from space 13056K, 91% used [0x00000000c7340000, 0x00000000c7ef55d8, 0x00000000c8000000)
  to space 13056K, 0% used [0x00000000c6680000, 0x00000000c6680000, 0x00000000c7340000)
 concurrent mark-sweep generation total 917504K, used 0K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000)
 Metaspace used 16164K, capacity 16382K, committed 16768K, reserved 1064960K
  class space used 1963K, capacity 2061K, committed 2176K, reserved 1048576K

==> /var/log/hadoop/hdfs/hdfs-audit.log.2017-07-27 <==
2017-07-27 23:58:36,534 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.ad7a3e33-fdbe-4bcc-bb5d-84824f6934d1 dst=null perm=null proto=rpc
2017-07-27 23:58:36,535 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.ad7a3e33-fdbe-4bcc-bb5d-84824f6934d1 dst=null perm=null proto=rpc
2017-07-27 23:58:36,535 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:58:46,136 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc
2017-07-27 23:58:46,536 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.063da8f2-0e48-4d74-a05b-0ada1797a5b2 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:58:46,538 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.063da8f2-0e48-4d74-a05b-0ada1797a5b2 dst=null perm=null proto=rpc
2017-07-27 23:58:46,539 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.063da8f2-0e48-4d74-a05b-0ada1797a5b2 dst=null perm=null proto=rpc
2017-07-27 23:58:46,539 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:58:56,540 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.c09e4c6b-cdb3-4e1e-9ba4-39bcc57fb9c6 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:58:56,542 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.c09e4c6b-cdb3-4e1e-9ba4-39bcc57fb9c6 dst=null perm=null proto=rpc
2017-07-27 23:58:56,543 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.c09e4c6b-cdb3-4e1e-9ba4-39bcc57fb9c6 dst=null perm=null proto=rpc
2017-07-27 23:58:56,543 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:58:58,773 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5
2017-07-27 23:59:06,545 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.112caf71-07ee-4fc6-a2c3-b492bba722ed dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:59:06,547 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.112caf71-07ee-4fc6-a2c3-b492bba722ed dst=null perm=null proto=rpc
2017-07-27 23:59:06,547 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.112caf71-07ee-4fc6-a2c3-b492bba722ed dst=null perm=null proto=rpc
2017-07-27 23:59:06,548 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:59:10,216 INFO FSNamesystem.audit: allowed=true ugi=oozie (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/user/oozie/share/lib dst=null perm=null proto=rpc
2017-07-27 23:59:16,549 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.e5795cd5-1168-4a0f-ab01-0ee499f3015d dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:59:16,551 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.e5795cd5-1168-4a0f-ab01-0ee499f3015d dst=null perm=null proto=rpc
2017-07-27 23:59:16,552 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.e5795cd5-1168-4a0f-ab01-0ee499f3015d dst=null perm=null proto=rpc
2017-07-27 23:59:16,552 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:59:26,554 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.774e2ce4-692a-47ab-97c2-f000cbee693b dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:59:26,555 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.774e2ce4-692a-47ab-97c2-f000cbee693b dst=null perm=null proto=rpc
2017-07-27 23:59:26,556 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.774e2ce4-692a-47ab-97c2-f000cbee693b dst=null perm=null proto=rpc
2017-07-27 23:59:26,556 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:59:36,558 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.27ab6f8e-aa2e-49c4-b029-2b611b2069fd dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:59:36,559 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.27ab6f8e-aa2e-49c4-b029-2b611b2069fd dst=null perm=null proto=rpc
2017-07-27 23:59:36,560 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.27ab6f8e-aa2e-49c4-b029-2b611b2069fd dst=null perm=null proto=rpc
2017-07-27 23:59:36,560 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:59:46,116 INFO FSNamesystem.audit: allowed=true ugi=mapred (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/mr-history/tmp dst=null perm=null proto=rpc
2017-07-27 23:59:46,562 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.cb883a34-a311-4347-a15d-9001f475557e dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:59:46,564 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.cb883a34-a311-4347-a15d-9001f475557e dst=null perm=null proto=rpc
2017-07-27 23:59:46,565 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.cb883a34-a311-4347-a15d-9001f475557e dst=null perm=null proto=rpc
2017-07-27 23:59:46,565 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:59:56,566 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=create src=/spark-history/.3495b01f-706a-42c6-89d7-89d665c09302 dst=null perm=spark:hadoop:rw-r--r-- proto=rpc
2017-07-27 23:59:56,568 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=getfileinfo src=/spark-history/.3495b01f-706a-42c6-89d7-89d665c09302 dst=null perm=null proto=rpc
2017-07-27 23:59:56,568 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=delete src=/spark-history/.3495b01f-706a-42c6-89d7-89d665c09302 dst=null perm=null proto=rpc
2017-07-27 23:59:56,569 INFO FSNamesystem.audit: allowed=true ugi=spark (auth:SIMPLE) ip=/10.6.240.212 cmd=listStatus src=/spark-history dst=null perm=null proto=rpc
2017-07-27 23:59:58,761 INFO FSNamesystem.audit: allowed=true ugi=yarn (auth:SIMPLE) ip=/10.6.240.213 cmd=listStatus src=/ats/active dst=null perm=null proto=rpc callerContext=yarn_ats_server_v1_5

==> /var/log/hadoop/hdfs/gc.log-201707251246 <==
2017-08-06T16:02:29.107-0400: 1048555.516: [GC (Allocation Failure) 2017-08-06T16:02:29.109-0400: 1048555.517: [ParNew: 107790K->2198K(118016K), 0.0523490 secs] 192193K->86601K(1035520K), 0.0537039 secs] [Times: user=0.20 sys=0.00, real=0.05 secs]
2017-08-06T16:12:43.827-0400: 1049170.236: [GC (Allocation Failure) 2017-08-06T16:12:43.828-0400: 1049170.237: [ParNew: 107158K->1750K(118016K), 0.0278074 secs] 191561K->86153K(1035520K), 0.0290352 secs] [Times: user=0.07 sys=0.01, real=0.03 secs]
2017-08-06T16:23:45.355-0400: 1049831.763: [GC (Allocation Failure) 2017-08-06T16:23:45.356-0400: 1049831.764: [ParNew: 106710K->2227K(118016K), 0.1178686 secs] 191113K->86631K(1035520K), 0.1189396 secs] [Times: user=0.39 sys=0.00, real=0.12 secs]
2017-08-06T16:33:53.489-0400: 1050439.898: [GC (Allocation Failure) 2017-08-06T16:33:53.490-0400: 1050439.899: [ParNew: 107187K->1754K(118016K), 0.0564037 secs] 191591K->86158K(1035520K), 0.0576223 secs] [Times: user=0.17 sys=0.00, real=0.06 secs]
2017-08-06T16:44:48.254-0400: 1051094.662: [GC (Allocation Failure) 2017-08-06T16:44:48.254-0400: 1051094.663: [ParNew: 106714K->2337K(118016K), 0.0313845 secs] 191118K->86740K(1035520K), 0.0317081 secs] [Times: user=0.08 sys=0.00, real=0.03 secs]
2017-08-06T16:55:42.084-0400: 1051748.493: [GC (Allocation Failure) 2017-08-06T16:55:42.085-0400: 1051748.493: [ParNew: 107297K->1948K(118016K), 0.0806512 secs] 191700K->86352K(1035520K), 0.0811196 secs] [Times: user=0.30 sys=0.01, real=0.08 secs]
2017-08-06T17:05:51.625-0400: 1052358.033: [GC (Allocation Failure) 2017-08-06T17:05:51.626-0400: 1052358.034: [ParNew: 106908K->2425K(118016K), 0.0180723 secs] 191312K->86829K(1035520K), 0.0191076 secs] [Times: user=0.04 sys=0.01, real=0.01 secs]
2017-08-06T17:16:53.423-0400: 1053019.831: [GC (Allocation Failure) 2017-08-06T17:16:53.424-0400: 1053019.833: [ParNew: 107212K->1490K(118016K), 0.0157989 secs] 191615K->85893K(1035520K), 0.0172233 secs] [Times: user=0.06 sys=0.00, real=0.02 secs]
2017-08-06T17:26:55.005-0400: 1053621.413: [GC (Allocation Failure) 2017-08-06T17:26:55.006-0400: 1053621.414: [ParNew: 106448K->1413K(118016K), 0.0162660 secs] 190852K->85816K(1035520K), 0.0173395 secs] [Times: user=0.05 sys=0.00, real=0.02 secs]
2017-08-06T17:37:56.398-0400: 1054282.806: [GC (Allocation Failure) 2017-08-06T17:37:56.399-0400: 1054282.807: [ParNew: 106098K->153K(118016K), 0.0142770 secs] 190501K->84556K(1035520K), 0.0157964 secs] [Times: user=0.03 sys=0.00, real=0.02 secs]
2017-08-06T17:47:57.721-0400: 1054884.130: [GC (Allocation Failure) 2017-08-06T17:47:57.722-0400: 1054884.131: [ParNew: 105113K->2303K(118016K), 0.0298942 secs] 189516K->86707K(1035520K), 0.0310245 secs] [Times: user=0.07 sys=0.00, real=0.03 secs]
2017-08-06T17:58:57.370-0400: 1055543.778: [GC (Allocation Failure) 2017-08-06T17:58:57.371-0400: 1055543.780: [ParNew: 107263K->1600K(118016K), 0.0155374 secs] 191667K->86003K(1035520K), 0.0169475 secs] [Times: user=0.05 sys=0.01, real=0.02 secs]
2017-08-06T18:09:00.264-0400: 1056146.672: [GC (Allocation Failure) 2017-08-06T18:09:00.265-0400: 1056146.673: [ParNew: 106560K->2338K(118016K), 0.0099914 secs] 190963K->86741K(1035520K), 0.0107771 secs] [Times: user=0.03 sys=0.00, real=0.01 secs]
2017-08-06T18:19:15.377-0400: 1056761.787: [GC (Allocation Failure) 2017-08-06T18:19:15.379-0400: 1056761.787: [ParNew: 107298K->1837K(118016K), 0.1453104 secs] 191701K->86241K(1035520K), 0.1474410 secs] [Times: user=0.09 sys=0.04, real=0.14 secs]
2017-08-06T18:30:03.348-0400: 1057409.756: [GC (Allocation Failure) 2017-08-06T18:30:03.348-0400: 1057409.756: [ParNew: 106797K->2830K(118016K), 0.0495224 secs] 191201K->87233K(1035520K), 0.0497065 secs] [Times: user=0.15 sys=0.00, real=0.05 secs]
2017-08-06T18:40:04.824-0400: 1058011.233: [GC (Allocation Failure) 2017-08-06T18:40:04.825-0400: 1058011.234: [ParNew: 107790K->3179K(118016K), 0.1702205 secs] 192193K->87582K(1035520K), 0.1713724 secs] [Times: user=0.65 sys=0.00, real=0.18 secs]
2017-08-06T18:51:06.764-0400: 1058673.173: [GC (Allocation Failure) 2017-08-06T18:51:06.765-0400: 1058673.173: [ParNew: 108138K->1753K(118016K), 0.0122295 secs] 192542K->86156K(1035520K), 0.0131630 secs] [Times: user=0.04 sys=0.00, real=0.02 secs]
2017-08-06T19:01:08.692-0400: 1059275.101: [GC (Allocation Failure) 2017-08-06T19:01:08.694-0400: 1059275.102: [ParNew: 106713K->2413K(118016K), 0.0161809 secs] 191116K->86817K(1035520K), 0.0174883 secs] [Times: user=0.05 sys=0.00, real=0.02 secs]
2017-08-06T19:11:17.135-0400: 1059883.544: [GC (Allocation Failure) 2017-08-06T19:11:17.137-0400: 1059883.546: [ParNew: 107373K->2010K(118016K), 0.0121683 secs] 191777K->86414K(1035520K), 0.0144359 secs] [Times: user=0.04 sys=0.00, real=0.02 secs]
2017-08-06T19:22:12.649-0400: 1060539.057: [GC (Allocation Failure) 2017-08-06T19:22:12.650-0400: 1060539.059: [ParNew: 106970K->1863K(118016K), 0.0083739 secs] 191374K->86266K(1035520K), 0.0096334 secs] [Times: user=0.02 sys=0.00, real=0.00 secs]
2017-08-06T19:32:14.028-0400: 1061140.436: [GC (Allocation Failure) 2017-08-06T19:32:14.029-0400: 1061140.437: [ParNew: 106823K->2650K(118016K), 0.0769214 secs] 191226K->87055K(1035520K), 0.0778792 secs] [Times: user=0.28 sys=0.00, real=0.08 secs]
2017-08-06T19:42:29.819-0400: 1061756.227: [GC (Allocation Failure) 2017-08-06T19:42:29.820-0400: 1061756.228: [ParNew: 107610K->2131K(118016K), 0.0126910 secs] 192015K->86536K(1035520K), 0.0136981 secs] [Times: user=0.04 sys=0.00, real=0.01 secs]
2017-08-06T19:53:16.921-0400: 1062403.330: [GC (Allocation Failure) 2017-08-06T19:53:16.923-0400: 1062403.331: [ParNew: 107091K->1640K(118016K), 0.0146804 secs] 191496K->86046K(1035520K), 0.0159657 secs] [Times: user=0.05 sys=0.00, real=0.02 secs]
2017-08-06T20:03:18.243-0400: 1063004.652: [GC (Allocation Failure) 2017-08-06T20:03:18.244-0400: 1063004.652: [ParNew: 106600K->2475K(118016K), 0.0102425 secs] 191006K->86881K(1035520K), 0.0106932 secs] [Times: user=0.02 sys=0.00, real=0.01 secs]
2017-08-06T20:13:19.853-0400: 1063606.261: [GC (Allocation Failure) 2017-08-06T20:13:19.855-0400: 1063606.263: [ParNew: 107435K->2920K(118016K), 0.0447934 secs] 191841K->87325K(1035520K), 0.0463459 secs] [Times: user=0.15 sys=0.00, real=0.05 secs]
2017-08-06T20:24:06.441-0400: 1064252.849: [GC (Allocation Failure) 2017-08-06T20:24:06.441-0400: 1064252.849: [ParNew: 107880K->1672K(118016K), 0.0165978 secs] 192285K->86077K(1035520K), 0.0172305 secs] [Times: user=0.04 sys=0.00, real=0.02 secs]
2017-08-06T20:34:22.334-0400: 1064868.742: [GC (Allocation Failure) 2017-08-06T20:34:22.334-0400: 1064868.743: [ParNew: 106632K->1656K(118016K), 0.0810089 secs] 191037K->86061K(1035520K), 0.0819226 secs] [Times: user=0.31 sys=0.00, real=0.08 secs]
2017-08-06T20:44:23.608-0400: 1065470.016: [GC (Allocation Failure) 2017-08-06T20:44:23.609-0400: 1065470.017: [ParNew: 106616K->2520K(118016K), 0.0225504 secs] 191021K->86925K(1035520K), 0.0232579 secs] [Times: user=0.02 sys=0.00, real=0.02 secs]
2017-08-06T20:54:25.044-0400: 1066071.452: [GC (Allocation Failure) 2017-08-06T20:54:25.045-0400: 1066071.453: [ParNew: 107480K->2170K(118016K), 0.0501134 secs] 191885K->86577K(1035520K), 0.0510522 secs] [Times: user=0.18 sys=0.01, real=0.05 secs]
2017-08-06T21:04:26.225-0400: 1066672.634: [GC (Allocation Failure) 2017-08-06T21:04:26.227-0400: 1066672.635: [ParNew: 107110K->1785K(118016K), 0.0182138 secs] 191517K->86192K(1035520K), 0.0199534 secs] [Times: user=0.04 sys=0.00, real=0.02 secs]
2017-08-06T21:15:23.077-0400: 1067329.485: [GC (Allocation Failure) 2017-08-06T21:15:23.078-0400: 1067329.486: [ParNew: 106745K->1312K(118016K), 0.1295100 secs] 191152K->85719K(1035520K), 0.1303377 secs] [Times: user=0.48 sys=0.00, real=0.13 secs]
2017-08-06T21:25:29.030-0400: 1067935.438: [GC (Allocation Failure) 2017-08-06T21:25:29.031-0400: 1067935.439: [ParNew: 106272K->1501K(118016K), 0.0172729 secs] 190679K->85908K(1035520K), 0.0181486 secs] [Times: user=0.05 sys=0.00, real=0.02 secs]
Heap
 par new generation total 118016K, used 76853K [0x00000000c0000000, 0x00000000c8000000, 0x00000000c8000000)
  eden space 104960K, 71% used [0x00000000c0000000, 0x00000000c4996318, 0x00000000c6680000)
  from space 13056K, 11% used [0x00000000c6680000, 0x00000000c67f7440, 0x00000000c7340000)
  to space 13056K, 0% used [0x00000000c7340000, 0x00000000c7340000, 0x00000000c8000000)
 concurrent mark-sweep generation total 917504K, used 84407K [0x00000000c8000000, 0x0000000100000000, 0x0000000100000000)
 Metaspace used 48313K, capacity 49173K, committed 49388K, reserved 1093632K
  class space used 5297K, capacity 5512K, committed 5624K, reserved 1048576K

Command failed after 1 tries
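
A note on reading the excerpts above: the old NameNode .log ends with "RECEIVED SIGNAL 15: SIGTERM" followed by the normal SHUTDOWN_MSG banner, and the .out files only contain the ulimit dump, so the actual reason the new start attempt returns 1 is most likely recorded in the current hadoop-hdfs-namenode-*.log / .out files rather than in the GC or audit logs shown here. Below is a minimal sketch, not part of Ambari or HDFS, that pulls the ERROR/FATAL/Exception/SIGTERM lines out of the newest NameNode log files; the directory and file-name patterns are assumptions taken from the paths shown above, so adjust them if your logs live elsewhere.

# check_namenode_logs.py -- hypothetical helper, not from the original post.
# Scans the most recently modified NameNode .log and .out files under the
# log directory seen in the output above and prints lines that usually
# explain a failed start.
import glob
import os

LOG_DIR = "/var/log/hadoop/hdfs"          # assumption: same directory as the tails above
MARKERS = ("ERROR", "FATAL", "Exception", "SIGTERM", "SHUTDOWN_MSG")

def newest(pattern):
    """Return the most recently modified file matching pattern, or None."""
    paths = glob.glob(os.path.join(LOG_DIR, pattern))
    return max(paths, key=os.path.getmtime) if paths else None

def interesting_lines(path, limit=40):
    """Return the last `limit` lines of `path` that contain a failure marker."""
    if path is None:
        return []
    with open(path, "r") as handle:
        hits = [line.rstrip("\n") for line in handle if any(m in line for m in MARKERS)]
    return hits[-limit:]

if __name__ == "__main__":
    for pattern in ("hadoop-hdfs-namenode-*.log", "hadoop-hdfs-namenode-*.out"):
        path = newest(pattern)
        print("==== %s ====" % (path or pattern))
        for line in interesting_lines(path):
            print(line)

Running this (or simply tail -n 200 on the same two files) right after another failed start from Ambari should surface the exception behind "Execution of 'ambari-sudo.sh su hdfs ... start namenode' returned 1", which the excerpts above cut off before.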