stderr: /var/lib/ambari-agent/data/errors-356.txt

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata_server.py", line 175, in <module>
    MetadataServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 943, in restart
    self.stop(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata_server.py", line 139, in stop
    user=params.metadata_user,
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'source /usr/hdp/current/atlas-server/conf/atlas-env.sh; /usr/hdp/current/atlas-server/bin/atlas_stop.py' returned 255.
Exception: [Errno 1] Operation not permitted
Traceback (most recent call last):
  File "/usr/hdp/current/atlas-server/bin/atlas_stop.py", line 95, in <module>
    returncode = main()
  File "/usr/hdp/current/atlas-server/bin/atlas_stop.py", line 59, in main
    os.kill(pid, SIGTERM)
OSError: [Errno 1] Operation not permitted

stdout: /var/lib/ambari-agent/data/output-356.txt

2018-04-23 14:34:34,351 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-91 -> 2.6.4.0-91
2018-04-23 14:34:34,354 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-04-23 14:34:34,463 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-91 -> 2.6.4.0-91
2018-04-23 14:34:34,464 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-04-23 14:34:34,465 - Group['livy'] {}
2018-04-23 14:34:34,466 - Group['spark'] {}
2018-04-23 14:34:34,466 - Group['ranger'] {}
2018-04-23 14:34:34,467 - Group['hdfs'] {}
2018-04-23 14:34:34,467 - Group['zeppelin'] {}
2018-04-23 14:34:34,467 - Group['hadoop'] {}
2018-04-23 14:34:34,467 - Group['users'] {}
2018-04-23 14:34:34,467 - Group['knox'] {}
2018-04-23 14:34:34,468 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,471 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,472 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,473 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,474 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-04-23 14:34:34,474 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,475 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,476 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-04-23 14:34:34,477 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger'], 'uid': None}
2018-04-23 14:34:34,478 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-04-23 14:34:34,479 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-04-23 14:34:34,480 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,481 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,482 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-04-23 14:34:34,483 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,484 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,485 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-04-23 14:34:34,486 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,487 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,488 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,489 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,490 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,491 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-04-23 14:34:34,492 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-23 14:34:34,494 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-04-23 14:34:34,515 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-04-23 14:34:34,516 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-04-23 14:34:34,517 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-23 14:34:34,518 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-23 14:34:34,519 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-04-23 14:34:34,544 - call returned (0, '1002')
2018-04-23 14:34:34,545 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-04-23 14:34:34,566 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] due to not_if
2018-04-23 14:34:34,566 - Group['hdfs'] {}
2018-04-23 14:34:34,567 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2018-04-23 14:34:34,567 - FS Type:
2018-04-23 14:34:34,568 - Directory['/etc/hadoop'] {'mode': 0755}
2018-04-23 14:34:34,581 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-04-23 14:34:34,582 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-04-23 14:34:34,595 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-04-23 14:34:34,619 - Skipping Execute[('setenforce', '0')] due to not_if
2018-04-23 14:34:34,620 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-04-23 14:34:34,622 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-04-23 14:34:34,623 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-04-23 14:34:34,627 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2018-04-23 14:34:34,628 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2018-04-23 14:34:34,634 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-04-23 14:34:34,642 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-04-23 14:34:34,643 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-04-23 14:34:34,644 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-04-23 14:34:34,648 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2018-04-23 14:34:34,669 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-04-23 14:34:35,061 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-91 -> 2.6.4.0-91
2018-04-23 14:34:35,062 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-04-23 14:34:35,063 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.4.0-91 -> 2.6.4.0-91
2018-04-23 14:34:35,068 - Execute['source /usr/hdp/current/atlas-server/conf/atlas-env.sh; /usr/hdp/current/atlas-server/bin/atlas_stop.py'] {'user': 'atlas'}
2018-04-23 14:34:35,151 - Execute['find /var/log/atlas -maxdepth 1 -type f -name '*' -exec echo '==> {} <==' \; -exec tail -n 40 {} \;'] {'logoutput': True, 'ignore_failures': True, 'user': 'atlas'}

==> /var/log/atlas/atlas.20180201-102756.out <==

==> /var/log/atlas/atlas.20180201-102756.err <==
log4j:WARN Continuable parsing error 37 and column 14
log4j:WARN The content of element type "appender" must match "(errorHandler?,param*,rollingPolicy?,triggeringPolicy?,connectionSource?,layout?,filter*,appender-ref*)".
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
==> /var/log/atlas/audit.log <==

==> /var/log/atlas/application.log <==
2018-04-23 14:17:31,422 INFO - [main:] ~ Creating ArrayBlockingQueue with maxSize=1048576 (AuditBatchQueue:98)
2018-04-23 14:17:31,423 INFO - [main:] ~ Starting writerThread, queueName=atlas.async.batch, consumer=atlas.async.batch.hdfs (AuditFileSpool:304)
2018-04-23 14:17:31,431 INFO - [main:] ~ PolicyEngineOptions: { evaluatorType: auto, cacheAuditResult: true, disableContextEnrichers: false, disableCustomConditions: false, disableTrieLookupPrefilter: false } (RangerBasePlugin:151)
2018-04-23 14:17:31,432 INFO - [Ranger async Audit cleanup:] ~ RangerAsyncAuditCleanup: Waiting to audit cleanup start signal (AuditProviderFactory:497)
2018-04-23 14:17:34,101 INFO - [main:] ~ PolicyRefresher(serviceName=Sandbox_atlas): found updated version. lastKnownVersion=-1; newVersion=11 (PolicyRefresher:277)
2018-04-23 14:17:34,162 INFO - [main:] ~ resourceName=entity; optIgnoreCase=true; optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=1; wildcardEvaluatorListRefCount=0 (RangerResourceTrie:112)
2018-04-23 14:17:34,162 INFO - [main:] ~ resourceName=type; optIgnoreCase=true; optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=1; wildcardEvaluatorListRefCount=0 (RangerResourceTrie:112)
2018-04-23 14:17:34,163 INFO - [main:] ~ resourceName=operation; optIgnoreCase=true; optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=1; wildcardEvaluatorListRefCount=0 (RangerResourceTrie:112)
2018-04-23 14:17:34,163 INFO - [main:] ~ resourceName=taxonomy; optIgnoreCase=true; optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=1; wildcardEvaluatorListRefCount=0 (RangerResourceTrie:112)
2018-04-23 14:17:34,163 INFO - [main:] ~ resourceName=term; optIgnoreCase=true; optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0; wildcardEvaluatorListCount=1; evaluatorListRefCount=1; wildcardEvaluatorListRefCount=0 (RangerResourceTrie:112)
2018-04-23 14:17:34,164 INFO - [main:] ~ Policies will NOT be reordered based on number of evaluations (RangerBasePlugin:181)
2018-04-23 14:17:34,621 INFO - [main:] ~ Creating filter chain: Ant [pattern='/login.jsp'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,621 INFO - [main:] ~ Creating filter chain: Ant [pattern='/css/**'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,621 INFO - [main:] ~ Creating filter chain: Ant [pattern='/img/**'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,621 INFO - [main:] ~ Creating filter chain: Ant [pattern='/libs/**'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,622 INFO - [main:] ~ Creating filter chain: Ant [pattern='/js/**'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,622 INFO - [main:] ~ Creating filter chain: Ant [pattern='/ieerror.html'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,622 INFO - [main:] ~ Creating filter chain: Ant [pattern='/api/atlas/admin/status'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,622 INFO - [main:] ~ Creating filter chain: Ant [pattern='/api/atlas/admin/metrics'], [] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,776 INFO - [main:] ~ Creating filter chain: org.springframework.security.web.util.matcher.AnyRequestMatcher@1, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@4cc032f5, org.springframework.security.web.context.SecurityContextPersistenceFilter@37ebbd0, org.springframework.security.web.authentication.logout.LogoutFilter@46fd5095, org.springframework.security.web.authentication.UsernamePasswordAuthenticationFilter@7af9ee4e, org.apache.atlas.web.filters.AtlasKnoxSSOAuthenticationFilter@797e2ebd, org.springframework.security.web.authentication.www.BasicAuthenticationFilter@6512680c, org.apache.atlas.web.filters.StaleTransactionCleanupFilter@40cfbb2f, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@6a4e7a7d, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@5cb7e69d, org.apache.atlas.web.filters.AtlasAuthenticationFilter@7e6ac9f4, org.apache.atlas.web.filters.AtlasCSRFPreventionFilter@33701b56, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@3bf3c043, org.springframework.security.web.session.SessionManagementFilter@32a0f4a4, org.springframework.security.web.access.ExceptionTranslationFilter@20394260, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@371ad888, org.apache.atlas.web.filters.AtlasAuthorizationFilter@65812d1] (DefaultSecurityFilterChain:43)
2018-04-23 14:17:34,864 INFO - [main:] ~ Root WebApplicationContext: initialization completed in 28058 ms (ContextLoader:344)
2018-04-23 14:17:34,869 INFO - [main:] ~ AuditFilter initialization started (AuditFilter:58)
2018-04-23 14:17:35,000 INFO - [main:] ~ Using default applicationContext (SpringServlet:136)
2018-04-23 14:17:35,002 INFO - [main:] ~ Registering Spring bean, allExceptionMapper, of type org.apache.atlas.web.errors.AllExceptionMapper as a provider class (SpringComponentProviderFactory:106)
2018-04-23 14:17:35,002 INFO - [main:] ~ Registering Spring bean, atlasBaseExceptionMapper, of type org.apache.atlas.web.errors.AtlasBaseExceptionMapper as a provider class (SpringComponentProviderFactory:106)
2018-04-23 14:17:35,002 INFO - [main:] ~ Registering Spring bean, notFoundExceptionMapper, of type org.apache.atlas.web.errors.NotFoundExceptionMapper as a provider class (SpringComponentProviderFactory:106)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, discoveryREST, of type org.apache.atlas.web.rest.DiscoveryREST as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, typesREST, of type org.apache.atlas.web.rest.TypesREST as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, lineageREST, of type org.apache.atlas.web.rest.LineageREST as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, entityREST, of type org.apache.atlas.web.rest.EntityREST as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, entityResource, of type org.apache.atlas.web.resources.EntityResource as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, taxonomyService, of type org.apache.atlas.web.resources.TaxonomyService as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, adminResource, of type org.apache.atlas.web.resources.AdminResource as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, metadataDiscoveryResource, of type org.apache.atlas.web.resources.MetadataDiscoveryResource as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,003 INFO - [main:] ~ Registering Spring bean, typesResource, of type org.apache.atlas.web.resources.TypesResource as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,004 INFO - [main:] ~ Registering Spring bean, entityService, of type org.apache.atlas.web.resources.EntityService as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,004 INFO - [main:] ~ Registering Spring bean, lineageResource, of type org.apache.atlas.web.resources.LineageResource as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,004 INFO - [main:] ~ Registering Spring bean, dataSetLineageResource, of type org.apache.atlas.web.resources.DataSetLineageResource as a root resource class (SpringComponentProviderFactory:111)
2018-04-23 14:17:35,007 INFO - [main:] ~ Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM' (WebApplicationImpl:815)
2018-04-23 14:17:35,743 INFO - [main:] ~ Started SelectChannelConnector@0.0.0.0:21000 (AbstractConnector:338)

==> /var/log/atlas/atlas.20180423-141656.err <==
log4j:WARN Continuable parsing error 37 and column 14
log4j:WARN The content of element type "appender" must match "(errorHandler?,param*,rollingPolicy?,triggeringPolicy?,connectionSource?,layout?,filter*,appender-ref*)".
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.

==> /var/log/atlas/atlas.20180423-141656.out <==

Command failed after 1 tries
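The decisive line in the dump is `os.kill(pid, SIGTERM)` in `atlas_stop.py` raising `OSError: [Errno 1] Operation not permitted`: the `atlas` user is not allowed to signal the PID the stop script read from its pid file. That usually means the recorded PID is stale and has since been reused by a process owned by a different user (for example, after an earlier start of the component as `root`). Below is a minimal diagnostic sketch of that check, not part of Ambari or Atlas; it assumes a Linux host with `/proc`, and the pid-file path you pass in is whatever your Atlas install actually uses (e.g. something under `/var/run/atlas/`, which may differ per install):

```python
import errno
import os
import pwd


def signal_probe(pid):
    """Classify `pid` from the caller's point of view. Signal 0 makes
    kill(2) perform its existence and permission checks without
    actually delivering a signal."""
    try:
        os.kill(pid, 0)
    except OSError as e:
        if e.errno == errno.ESRCH:
            return "no-such-process"   # stale pid file; safe to remove
        if e.errno == errno.EPERM:
            return "not-permitted"     # pid exists but belongs to another user
        raise
    return "signalable"


def pid_file_report(pid_file, expected_user):
    """Compare the owner of the process recorded in `pid_file` against
    the user the service should run as. EPERM on SIGTERM (as in the
    atlas_stop.py traceback) implies the two differ."""
    with open(pid_file) as f:
        pid = int(f.read().strip())
    state = signal_probe(pid)
    if state == "signalable":
        # On Linux, /proc/<pid> is owned by the process's uid.
        owner = pwd.getpwuid(os.stat("/proc/%d" % pid).st_uid).pw_name
        if owner != expected_user:
            return "mismatch: pid %d runs as %s, not %s" % (pid, owner, expected_user)
        return "ok: pid %d running as %s" % (pid, owner)
    return "%s: pid %d (consider removing %s)" % (state, pid, pid_file)
```

If the report comes back `not-permitted` or `mismatch`, the stored PID no longer belongs to a process the service user may stop; the usual recovery is to kill any leftover Atlas process as root, remove the pid file, and restart the component from Ambari.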