INFO 2016-10-31 21:31:26,546 main.py:90 - loglevel=logging.INFO
INFO 2016-10-31 21:31:26,546 main.py:90 - loglevel=logging.INFO
INFO 2016-10-31 21:31:26,546 main.py:90 - loglevel=logging.INFO
INFO 2016-10-31 21:31:26,548 ExitHelper.py:53 - Performing cleanup before exiting...
INFO 2016-10-31 21:31:30,215 main.py:90 - loglevel=logging.INFO
INFO 2016-10-31 21:31:30,216 main.py:90 - loglevel=logging.INFO
INFO 2016-10-31 21:31:30,216 main.py:90 - loglevel=logging.INFO
INFO 2016-10-31 21:31:30,217 DataCleaner.py:39 - Data cleanup thread started
INFO 2016-10-31 21:31:30,218 DataCleaner.py:120 - Data cleanup started
INFO 2016-10-31 21:31:30,219 DataCleaner.py:122 - Data cleanup finished
INFO 2016-10-31 21:31:30,252 PingPortListener.py:50 - Ping port listener started on port: 8670
INFO 2016-10-31 21:31:30,254 main.py:349 - Connecting to Ambari server at https://sandbox.hortonworks.com:8440 (127.0.0.1)
INFO 2016-10-31 21:31:30,254 NetUtil.py:62 - Connecting to https://sandbox.hortonworks.com:8440/ca
INFO 2016-10-31 21:31:30,316 main.py:359 - Connected to Ambari server sandbox.hortonworks.com
WARNING 2016-10-31 21:31:30,316 ClusterConfiguration.py:71 - Unable to load configurations from /var/lib/ambari-agent/cache/cluster_configuration/configurations.json. This file will be regenerated on registration
INFO 2016-10-31 21:31:30,317 Controller.py:542 - Exception in move_data_dir_mount_file(). Error: invalid literal for int() with base 10: ''
INFO 2016-10-31 21:31:30,317 threadpool.py:52 - Started thread pool with 3 core threads and 20 maximum threads
WARNING 2016-10-31 21:31:30,317 AlertSchedulerHandler.py:261 - [AlertScheduler] /var/lib/ambari-agent/cache/alerts/definitions.json not found or invalid. No alerts will be scheduled until registration occurs.
INFO 2016-10-31 21:31:30,317 AlertSchedulerHandler.py:156 - [AlertScheduler] Starting ; currently running: False
INFO 2016-10-31 21:31:32,322 hostname.py:98 - Read public hostname 'localhost' using socket.getfqdn()
INFO 2016-10-31 21:31:32,340 logger.py:71 - call[['test', '-w', '/']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,368 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,368 logger.py:71 - call[['test', '-w', '/']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,384 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,384 logger.py:71 - call[['test', '-w', '/dev']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,396 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,397 logger.py:71 - call[['test', '-w', '/sys/fs/cgroup']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,410 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,410 logger.py:71 - call[['test', '-w', '/hadoop']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,422 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,422 logger.py:71 - call[['test', '-w', '/etc/resolv.conf']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,436 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,436 logger.py:71 - call[['test', '-w', '/etc/hostname']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,447 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,447 logger.py:71 - call[['test', '-w', '/etc/hosts']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,458 logger.py:71 - call returned (0, '')
INFO 2016-10-31 21:31:32,459 logger.py:71 - call[['test', '-w', '/dev/shm']] {'sudo': True, 'timeout': 5}
INFO 2016-10-31 21:31:32,469 logger.py:71 - call returned (0, '')
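Note the Controller.py:542 entry above: "invalid literal for int() with base 10: ''" is Python's standard ValueError text for calling int() on an empty string, which suggests move_data_dir_mount_file() read an empty field out of the agent's data-dir mount file. The snippet below is a hypothetical reconstruction of that failure mode, not Ambari's actual code; the function names are illustrative, and the defensive variant shows how an empty field could be tolerated instead of raised.

    # Hypothetical reconstruction of "invalid literal for int() with base 10: ''".
    # Names are illustrative; this is not Ambari's code.
    def parse_field(field):
        # parse_field("") raises: ValueError: invalid literal for int() with base 10: ''
        return int(field)

    def parse_field_safe(field, default=0):
        # Defensive variant: tolerate the empty field seen in the log.
        try:
            return int(field)
        except ValueError:
            return default

    print(parse_field_safe(""))   # -> 0 instead of a raised ValueError
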
INFO 2016-10-31 21:31:32,482 Facter.py:194 - Directory: '/etc/resource_overrides' does not exist - it won't be used for gathering system resources.
INFO 2016-10-31 21:31:32,560 Controller.py:160 - Registering with localhost (127.0.0.1) (agent='{"hardwareProfile": {"kernel": "Linux", "domain": "", "physicalprocessorcount": 1, "kernelrelease": "3.10.0-327.el7.x86_64", "uptime_days": "0", "memorytotal": 8011268, "swapfree": "4.88 GB", "memorysize": 8011268, "osfamily": "redhat", "swapsize": "4.88 GB", "processorcount": 1, "netmask": "255.0.0.0", "timezone": "UTC", "hardwareisa": "x86_64", "memoryfree": 2121948, "operatingsystem": "centos", "kernelmajversion": "3.10", "kernelversion": "3.10.0", "macaddress": "02:42:AC:11:00:02", "operatingsystemrelease": "6.8", "ipaddress": "127.0.0.1", "hostname": "localhost", "uptime_hours": "0", "fqdn": "localhost", "id": "root", "architecture": "x86_64", "selinux": false, "mounts": [{"available": "24336300", "used": "18444736", "percent": "44%", "device": "rootfs", "mountpoint": "/", "type": "rootfs", "size": "45094812"}, {"available": "24336300", "used": "18444736", "percent": "44%", "device": "overlay", "mountpoint": "/", "type": "overlay", "size": "45094812"}, {"available": "4005632", "used": "0", "percent": "0%", "device": "tmpfs", "mountpoint": "/dev", "type": "tmpfs", "size": "4005632"}, {"available": "4005632", "used": "0", "percent": "0%", "device": "tmpfs", "mountpoint": "/sys/fs/cgroup", "type": "tmpfs", "size": "4005632"}, {"available": "24336300", "used": "18444736", "percent": "44%", "device": "/dev/sda3", "mountpoint": "/hadoop", "type": "ext4", "size": "45094812"}, {"available": "24336300", "used": "18444736", "percent": "44%", "device": "/dev/sda3", "mountpoint": "/etc/resolv.conf", "type": "ext4", "size": "45094812"}, {"available": "24336300", "used": "18444736", "percent": "44%", "device": "/dev/sda3", "mountpoint": "/etc/hostname", "type": "ext4", "size": "45094812"}, {"available": "24336300", "used": "18444736", "percent": "44%", "device": "/dev/sda3", "mountpoint": "/etc/hosts", "type": "ext4", "size": "45094812"}, {"available": "65536", "used": "0", "percent": "0%", "device": "shm", "mountpoint": "/dev/shm", "type": "tmpfs", "size": "65536"}], "hardwaremodel": "x86_64", "uptime_seconds": "598", "interfaces": "eth0,lo"}, "currentPingPort": 8670, "prefix": "/var/lib/ambari-agent/data", "agentVersion": "", "agentEnv": {"transparentHugePage": "", "hostHealth": {"agentTimeStampAtReporting": 1477949492552, "activeJavaProcs": [{"command": "java -Dproc_rangeradmin -XX:MaxPermSize=256m -Xmx1024m -Xms1024m -Dlogdir=/var/log/ranger/admin -Dcatalina.base=/usr/hdp/2.5.0.0-1245/ranger-admin/ews -cp /usr/hdp/2.5.0.0-1245/ranger-admin/ews/webapp/WEB-INF/classes/conf:/usr/hdp/2.5.0.0-1245/ranger-admin/ews/lib/*:/usr/hdp/2.5.0.0-1245/ranger-admin/ews/ranger_jaas/*:/usr/hdp/2.5.0.0-1245/ranger-admin/ews/webapp/WEB-INF/classes/conf/ranger_jaas:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/lib/*:/*: org.apache.ranger.server.tomcat.EmbeddedServer", "pid": 88, "hadoop": false, "user": "ranger"}, {"command": "/usr/lib/jvm/java/bin/java -Dproc_datanode -Xmx250m -Dhdp.version=2.5.0.0-1245 -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop/hdfs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str=hdfs -Dhadoop.root.logger=INFO,console 
-Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.0.0-1245 -Dhadoop.log.dir=/var/log/hadoop/hdfs -Dhadoop.log.file=hadoop-hdfs-datanode-sandbox.hortonworks.com.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str=hdfs -Dhadoop.root.logger=INFO,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -server -server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=200m -XX:MaxNewSize=200m -XX:PermSize=128m -XX:MaxPermSize=256m -Xloggc:/var/log/hadoop/hdfs/gc.log-201610312121 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms250m -Xmx250m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=200m -XX:MaxNewSize=200m -XX:PermSize=128m -XX:MaxPermSize=256m -Xloggc:/var/log/hadoop/hdfs/gc.log-201610312121 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms250m -Xmx250m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -server -XX:ParallelGCThreads=4 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=200m -XX:MaxNewSize=200m -XX:PermSize=128m -XX:MaxPermSize=256m -Xloggc:/var/log/hadoop/hdfs/gc.log-201610312121 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms250m -Xmx250m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.hdfs.server.datanode.DataNode", "pid": 656, "hadoop": true, "user": "hdfs"}, {"command": "/usr/lib/jvm/java/bin/java -Dproc_namenode -Xmx250m -Dhdp.version=2.5.0.0-1245 -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop/hdfs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str=hdfs -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.0.0-1245 -Dhadoop.log.dir=/var/log/hadoop/hdfs -Dhadoop.log.file=hadoop-hdfs-namenode-sandbox.hortonworks.com.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str=hdfs -Dhadoop.root.logger=INFO,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=50m -XX:MaxNewSize=100m -XX:PermSize=128m -XX:MaxPermSize=256m -Xloggc:/var/log/hadoop/hdfs/gc.log-201610312121 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Xms250m -Xmx250m 
-Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:OnOutOfMemoryError=\\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\\" -Dorg.mortbay.jetty.Request.maxFormContentSize=-1 -server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=50m -XX:MaxNewSize=100m -XX:PermSize=128m -XX:MaxPermSize=256m -Xloggc:/var/log/hadoop/hdfs/gc.log-201610312121 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Xms250m -Xmx250m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:OnOutOfMemoryError=\\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\\" -Dorg.mortbay.jetty.Request.maxFormContentSize=-1 -server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log -XX:NewSize=50m -XX:MaxNewSize=100m -XX:PermSize=128m -XX:MaxPermSize=256m -Xloggc:/var/log/hadoop/hdfs/gc.log-201610312121 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -XX:CMSInitiatingOccupancyFraction=70 -XX:+UseCMSInitiatingOccupancyOnly -Xms250m -Xmx250m -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT -XX:OnOutOfMemoryError=\\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\\" -Dorg.mortbay.jetty.Request.maxFormContentSize=-1 -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.hdfs.server.namenode.NameNode", "pid": 658, "hadoop": true, "user": "hdfs"}, {"command": "java -Dproc_rangerusersync -Dlog4j.configuration=file:/etc/ranger/usersync/conf/log4j.properties -Dlogdir=/var/log/ranger/usersync -cp /usr/hdp/2.5.0.0-1245/ranger-usersync/dist/*:/usr/hdp/2.5.0.0-1245/ranger-usersync/lib/*:/usr/hdp/2.5.0.0-1245/ranger-usersync/conf:/* org.apache.ranger.authentication.UnixAuthenticationService -enableUnixAuth", "pid": 772, "hadoop": false, "user": "ranger"}, {"command": "/usr/lib/jvm/java/bin/java -Dzookeeper.log.dir=/var/log/zookeeper -Dzookeeper.log.file=zookeeper-zookeeper-server-sandbox.hortonworks.com.log -Dzookeeper.root.logger=INFO,ROLLINGFILE -cp 
/usr/hdp/current/zookeeper-server/bin/../build/classes:/usr/hdp/current/zookeeper-server/bin/../build/lib/*.jar:/usr/hdp/current/zookeeper-server/bin/../lib/xercesMinimal-1.9.6.2.jar:/usr/hdp/current/zookeeper-server/bin/../lib/wagon-provider-api-2.4.jar:/usr/hdp/current/zookeeper-server/bin/../lib/wagon-http-shared4-2.4.jar:/usr/hdp/current/zookeeper-server/bin/../lib/wagon-http-shared-1.0-beta-6.jar:/usr/hdp/current/zookeeper-server/bin/../lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/hdp/current/zookeeper-server/bin/../lib/wagon-http-2.4.jar:/usr/hdp/current/zookeeper-server/bin/../lib/wagon-file-1.0-beta-6.jar:/usr/hdp/current/zookeeper-server/bin/../lib/slf4j-log4j12-1.6.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/slf4j-api-1.6.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/plexus-utils-3.0.8.jar:/usr/hdp/current/zookeeper-server/bin/../lib/plexus-interpolation-1.11.jar:/usr/hdp/current/zookeeper-server/bin/../lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/netty-3.7.0.Final.jar:/usr/hdp/current/zookeeper-server/bin/../lib/nekohtml-1.9.6.2.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-settings-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-repository-metadata-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-project-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-profile-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-plugin-registry-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-model-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-error-diagnostics-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-artifact-manager-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-artifact-2.2.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/maven-ant-tasks-2.1.3.jar:/usr/hdp/current/zookeeper-server/bin/../lib/log4j-1.2.16.jar:/usr/hdp/current/zookeeper-server/bin/../lib/jsoup-1.7.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/jline-0.9.94.jar:/usr/hdp/current/zookeeper-server/bin/../lib/commons-logging-1.1.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/commons-io-2.2.jar:/usr/hdp/current/zookeeper-server/bin/../lib/commons-codec-1.6.jar:/usr/hdp/current/zookeeper-server/bin/../lib/classworlds-1.1-alpha-2.jar:/usr/hdp/current/zookeeper-server/bin/../lib/backport-util-concurrent-3.1.jar:/usr/hdp/current/zookeeper-server/bin/../lib/ant-launcher-1.8.0.jar:/usr/hdp/current/zookeeper-server/bin/../lib/ant-1.8.0.jar:/usr/hdp/current/zookeeper-server/bin/../zookeeper-3.4.6.2.5.0.0-1245.jar:/usr/hdp/current/zookeeper-server/bin/../src/java/lib/*.jar:/etc/zookeeper/conf::/usr/share/zookeeper/*:/usr/share/zookeeper/* -Xmx1024m -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.local.only=false org.apache.zookeeper.server.quorum.QuorumPeerMain /etc/zookeeper/conf/zoo.cfg", "pid": 902, "hadoop": true, "user": "zookeeper"}, {"command": "/usr/lib/jvm/java/bin/java -Dproc_portmap -Xmx250m -Dhdp.version=2.5.0.0-1245 -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop/ -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.0.0-1245 
-Dhadoop.log.dir=/var/log/hadoop/ -Dhadoop.log.file=hadoop--portmap-sandbox.hortonworks.com.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str= -Dhadoop.root.logger=INFO,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS org.apache.hadoop.portmap.Portmap", "pid": 1124, "hadoop": true, "user": "root"}, {"command": "/usr/lib/jvm/java/bin/java -Djava.util.logging.config.file=/usr/hdp/current/oozie-server/oozie-server/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Dhdp.version=2.5.0.0-1245 -Xmx1024m -XX:MaxPermSize=512m -Xmx1024m -XX:MaxPermSize=512m -Dderby.stream.error.file=/var/log/oozie/derby.log -Doozie.home.dir=/usr/hdp/2.5.0.0-1245/oozie -Doozie.config.dir=/usr/hdp/current/oozie-server/conf -Doozie.log.dir=/var/log/oozie -Doozie.data.dir=/hadoop/oozie/data -Doozie.instance.id=sandbox.hortonworks.com -Doozie.config.file=oozie-site.xml -Doozie.log4j.file=oozie-log4j.properties -Doozie.log4j.reload=10 -Doozie.http.hostname=sandbox.hortonworks.com -Doozie.admin.port=11001 -Doozie.http.port=11000 -Doozie.https.port=11443 -Doozie.base.url=http://sandbox.hortonworks.com:11000/oozie -Doozie.https.keystore.file=/home/oozie/.keystore -Doozie.https.keystore.pass=password -Djava.library.path=/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64 -Djava.endorsed.dirs=/usr/lib/bigtop-tomcat/endorsed -classpath /usr/lib/bigtop-tomcat/bin/bootstrap.jar -Dcatalina.base=/usr/hdp/current/oozie-server/oozie-server -Dcatalina.home=/usr/lib/bigtop-tomcat -Djava.io.tmpdir=/var/tmp/oozie org.apache.catalina.startup.Bootstrap start", "pid": 1246, "hadoop": true, "user": "oozie"}, {"command": "/usr/lib/jvm/java/bin/java -Dproc_resourcemanager -Xmx250m -Dhdp.version=2.5.0.0-1245 -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-resourcemanager-sandbox.hortonworks.com.log -Dyarn.log.file=yarn-yarn-resourcemanager-sandbox.hortonworks.com.log -Dyarn.home.dir= -Dyarn.id.str=yarn -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -Dyarn.policy.file=hadoop-policy.xml -Djava.io.tmpdir=/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -Dyarn.server.resourcemanager.appsummary.logger=INFO,RMSUMMARY -Drm.audit.logger=INFO,RMAUDIT -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-resourcemanager-sandbox.hortonworks.com.log -Dyarn.log.file=yarn-yarn-resourcemanager-sandbox.hortonworks.com.log -Dyarn.home.dir=/usr/hdp/current/hadoop-yarn-resourcemanager -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA 
-Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -classpath /etc/hadoop/conf:/etc/hadoop/conf:/etc/hadoop/conf:/usr/hdp/2.5.0.0-1245/hadoop/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/.//*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/./:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/.//*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/.//*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/.//*::jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.5.0.0-1245/tez/*:/usr/hdp/2.5.0.0-1245/tez/lib/*:/usr/hdp/2.5.0.0-1245/tez/conf:jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.5.0.0-1245/tez/*:/usr/hdp/2.5.0.0-1245/tez/lib/*:/usr/hdp/2.5.0.0-1245/tez/conf:/usr/hdp/current/hadoop-yarn-resourcemanager/.//*:/usr/hdp/current/hadoop-yarn-resourcemanager/lib/*:/etc/hadoop/conf/rm-config/log4j.properties org.apache.hadoop.yarn.server.resourcemanager.ResourceManager", "pid": 1614, "hadoop": true, "user": "yarn"}, {"command": "/usr/lib/jvm/java/bin/java -Dproc_historyserver -Xmx250m -Dhdp.version=2.5.0.0-1245 -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-historyserver-sandbox.hortonworks.com.log -Dyarn.log.file=yarn-yarn-historyserver-sandbox.hortonworks.com.log -Dyarn.home.dir= -Dyarn.id.str=yarn -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -Dyarn.policy.file=hadoop-policy.xml -Djava.io.tmpdir=/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-historyserver-sandbox.hortonworks.com.log -Dyarn.log.file=yarn-yarn-historyserver-sandbox.hortonworks.com.log -Dyarn.home.dir=/usr/hdp/current/hadoop-yarn-resourcemanager -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -classpath 
/etc/hadoop/conf:/etc/hadoop/conf:/etc/hadoop/conf:/usr/hdp/2.5.0.0-1245/hadoop/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/.//*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/./:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/.//*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/.//*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/.//*::jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.5.0.0-1245/tez/*:/usr/hdp/2.5.0.0-1245/tez/lib/*:/usr/hdp/2.5.0.0-1245/tez/conf:jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.5.0.0-1245/tez/*:/usr/hdp/2.5.0.0-1245/tez/lib/*:/usr/hdp/2.5.0.0-1245/tez/conf:/usr/hdp/current/hadoop-yarn-resourcemanager/.//*:/usr/hdp/current/hadoop-yarn-resourcemanager/lib/*:/etc/hadoop/conf/ahs-config/log4j.properties org.apache.hadoop.yarn.server.applicationhistoryservice.ApplicationHistoryServer", "pid": 1631, "hadoop": true, "user": "yarn"}, {"command": "/usr/lib/jvm/java/bin/java -Dproc_nodemanager -Xmx512m -Dhdp.version=2.5.0.0-1245 -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-nodemanager-sandbox.hortonworks.com.log -Dyarn.log.file=yarn-yarn-nodemanager-sandbox.hortonworks.com.log -Dyarn.home.dir= -Dyarn.id.str=yarn -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -Dyarn.policy.file=hadoop-policy.xml -Djava.io.tmpdir=/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -server -Dnm.audit.logger=INFO,NMAUDIT -Dnm.audit.logger=INFO,NMAUDIT -Dhadoop.log.dir=/var/log/hadoop-yarn/yarn -Dyarn.log.dir=/var/log/hadoop-yarn/yarn -Dhadoop.log.file=yarn-yarn-nodemanager-sandbox.hortonworks.com.log -Dyarn.log.file=yarn-yarn-nodemanager-sandbox.hortonworks.com.log -Dyarn.home.dir=/usr/hdp/current/hadoop-yarn-resourcemanager -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.root.logger=INFO,EWMA,RFA -Dyarn.root.logger=INFO,EWMA,RFA -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -classpath 
/etc/hadoop/conf:/etc/hadoop/conf:/etc/hadoop/conf:/usr/hdp/2.5.0.0-1245/hadoop/lib/*:/usr/hdp/2.5.0.0-1245/hadoop/.//*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/./:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-hdfs/.//*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-yarn/.//*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/lib/*:/usr/hdp/2.5.0.0-1245/hadoop-mapreduce/.//*::jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.5.0.0-1245/tez/*:/usr/hdp/2.5.0.0-1245/tez/lib/*:/usr/hdp/2.5.0.0-1245/tez/conf:jdbc-mysql.jar:mysql-connector-java-5.1.17.jar:mysql-connector-java-5.1.37.jar:mysql-connector-java.jar:/usr/hdp/2.5.0.0-1245/tez/*:/usr/hdp/2.5.0.0-1245/tez/lib/*:/usr/hdp/2.5.0.0-1245/tez/conf:/usr/hdp/current/hadoop-yarn-resourcemanager/.//*:/usr/hdp/current/hadoop-yarn-resourcemanager/lib/*:/etc/hadoop/conf/nm-config/log4j.properties org.apache.hadoop.yarn.server.nodemanager.NodeManager", "pid": 1678, "hadoop": true, "user": "yarn"}, {"command": "/usr/lib/jvm/java/bin/java -Xmx250m -Dhdp.version=2.5.0.0-1245 -Djava.net.preferIPv4Stack=true -Dwebhcat.log.dir=/var/log/webhcat/ -Dlog4j.configuration=file:///usr/hdp/2.5.0.0-1245/hive-hcatalog/sbin/../etc/webhcat/webhcat-log4j.properties -Dhdp.version=2.5.0.0-1245 -Dhadoop.log.dir=/var/log/hadoop/hcat -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str=hcat -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx250m -XX:MaxPermSize=512m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.5.0.0-1245/hive-hcatalog/sbin/../share/webhcat/svr/lib/hive-webhcat-1.2.1000.2.5.0.0-1245.jar org.apache.hive.hcatalog.templeton.Main", "pid": 1896, "hadoop": true, "user": "hcat"}, {"command": "/usr/lib/jvm/java/bin/java -Dhdp.version=2.5.0.0-1245 -cp /usr/hdp/2.5.0.0-1245/spark/sbin/../conf/:/usr/hdp/2.5.0.0-1245/spark/lib/spark-assembly-1.6.2.2.5.0.0-1245-hadoop2.7.3.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/spark/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.5.0.0-1245/spark/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.5.0.0-1245/spark/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/current/hadoop-client/conf/:/usr/hdp/2.5.0.0-1245/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-1245.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-s3-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-core-1.10.6.jar:/usr/hdp/current/hadoop-client/lib/aws-java-sdk-kms-1.10.6.jar -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.history.HistoryServer", "pid": 1975, "hadoop": true, "user": "spark"}, {"command": "/usr/lib/jvm/java/bin/java -Dhdp.version=None -Dspark.executor.memory=512m -Dspark.executor.instances=2 -Dspark.yarn.queue=default -Dfile.encoding=UTF-8 -Xms1024m -Xmx1024m -XX:MaxPermSize=512m -Dlog4j.configuration=file:///usr/hdp/current/zeppelin-server/conf/log4j.properties -Dzeppelin.log.file=/var/log/zeppelin/zeppelin-zeppelin-sandbox.hortonworks.com.log -cp ::/usr/hdp/current/zeppelin-server/lib/*:/usr/hdp/current/zeppelin-server/*::/usr/hdp/current/zeppelin-server/conf org.apache.zeppelin.server.ZeppelinServer", "pid": 2281, "hadoop": false, "user": "zeppelin"}, {"command": "/usr/lib/jvm/java/bin/java -Dproc_historyserver -Xmx250m -Dhdp.version=2.5.0.0-1245 -Djava.net.preferIPv4Stack=true 
-Djava.io.tmpdir=/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir -Dhdp.version= -Dhdp.version= -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop/mapred -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str=mapred -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop-mapreduce/mapred -Dhadoop.log.file=hadoop.log -Dhadoop.root.logger=INFO,console -Dhadoop.id.str=mapred -Dhdp.version=2.5.0.0-1245 -Dhadoop.log.dir=/var/log/hadoop/mapred -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.0.0-1245/hadoop -Dhadoop.id.str=mapred -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native:/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir:/usr/hdp/2.5.0.0-1245/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.5.0.0-1245/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/var/log/hadoop-mapreduce/mapred -Dhadoop.log.file=mapred-mapred-historyserver-sandbox.hortonworks.com.log -Dhadoop.root.logger=INFO,RFA -Dmapred.jobsummary.logger=INFO,JSA -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer", "pid": 4250, "hadoop": true, "user": "mapred"}], "liveServices": [{"status": "Unhealthy", "name": "ntpd", "desc": "ntpd is stopped\\n"}]}, "reverseLookup": true, "alternatives": [{"name": "hue-conf", "target": "/etc/hue/conf.empty"}], "umask": "18", "firewallName": "iptables", "stackFoldersAndFiles": [{"type": "directory", "name": "/etc/hadoop"}, {"type": "directory", "name": "/etc/hbase"}, {"type": "directory", "name": "/etc/hive"}, {"type": "directory", "name": "/etc/oozie"}, {"type": "directory", "name": "/etc/sqoop"}, {"type": "directory", "name": "/etc/hue"}, {"type": "directory", "name": "/etc/zookeeper"}, {"type": "directory", "name": "/etc/flume"}, {"type": "directory", "name": "/etc/storm"}, {"type": "directory", "name": "/etc/hive-hcatalog"}, {"type": "directory", "name": "/etc/tez"}, {"type": "directory", "name": "/etc/falcon"}, {"type": "directory", "name": "/etc/knox"}, {"type": "directory", "name": "/etc/hive-webhcat"}, {"type": "directory", "name": "/etc/kafka"}, {"type": "directory", "name": "/etc/slider"}, {"type": "directory", "name": "/etc/storm-slider-client"}, {"type": "directory", "name": "/etc/spark"}, {"type": "directory", "name": "/etc/pig"}, {"type": "directory", "name": "/etc/phoenix"}, {"type": "directory", "name": "/etc/ranger"}, {"type": "directory", "name": "/etc/ambari-metrics-collector"}, {"type": "directory", "name": "/etc/ambari-metrics-monitor"}, {"type": "directory", "name": "/etc/atlas"}, {"type": "directory", "name": "/etc/zeppelin"}, {"type": "directory", "name": "/var/run/hadoop"}, {"type": "directory", "name": "/var/run/hbase"}, {"type": "directory", "name": "/var/run/hive"}, {"type": "directory", "name": "/var/run/oozie"}, {"type": "directory", "name": "/var/run/sqoop"}, {"type": "directory", "name": "/var/run/zookeeper"}, {"type": "directory", "name": "/var/run/flume"}, {"type": "directory", "name": "/var/run/storm"}, {"type": "directory", "name": "/var/run/hive-hcatalog"}, {"type": "directory", "name": "/var/run/falcon"}, {"type": "directory", "name": "/var/run/webhcat"}, {"type": "directory", "name": "/var/run/hadoop-yarn"}, {"type": 
"directory", "name": "/var/run/hadoop-mapreduce"}, {"type": "directory", "name": "/var/run/knox"}, {"type": "directory", "name": "/var/run/kafka"}, {"type": "directory", "name": "/var/run/spark"}, {"type": "directory", "name": "/var/run/ranger"}, {"type": "directory", "name": "/var/run/ambari-metrics-collector"}, {"type": "directory", "name": "/var/run/ambari-metrics-monitor"}, {"type": "directory", "name": "/var/run/atlas"}, {"type": "directory", "name": "/var/run/zeppelin"}, {"type": "directory", "name": "/var/log/hadoop"}, {"type": "directory", "name": "/var/log/hbase"}, {"type": "directory", "name": "/var/log/hive"}, {"type": "directory", "name": "/var/log/oozie"}, {"type": "directory", "name": "/var/log/sqoop"}, {"type": "directory", "name": "/var/log/hue"}, {"type": "directory", "name": "/var/log/zookeeper"}, {"type": "directory", "name": "/var/log/flume"}, {"type": "directory", "name": "/var/log/storm"}, {"type": "directory", "name": "/var/log/hive-hcatalog"}, {"type": "directory", "name": "/var/log/falcon"}, {"type": "directory", "name": "/var/log/webhcat"}, {"type": "directory", "name": "/var/log/hadoop-yarn"}, {"type": "directory", "name": "/var/log/hadoop-mapreduce"}, {"type": "directory", "name": "/var/log/knox"}, {"type": "directory", "name": "/var/log/kafka"}, {"type": "directory", "name": "/var/log/spark"}, {"type": "directory", "name": "/var/log/ranger"}, {"type": "directory", "name": "/var/log/ambari-metrics-collector"}, {"type": "directory", "name": "/var/log/ambari-metrics-monitor"}, {"type": "directory", "name": "/var/log/atlas"}, {"type": "directory", "name": "/var/log/zeppelin"}, {"type": "directory", "name": "/usr/lib/flume"}, {"type": "directory", "name": "/usr/lib/storm"}, {"type": "directory", "name": "/usr/lib/ambari-metrics-collector"}, {"type": "directory", "name": "/var/lib/hive"}, {"type": "directory", "name": "/var/lib/oozie"}, {"type": "directory", "name": "/var/lib/hue"}, {"type": "directory", "name": "/var/lib/flume"}, {"type": "directory", "name": "/var/lib/hadoop-hdfs"}, {"type": "directory", "name": "/var/lib/hadoop-yarn"}, {"type": "directory", "name": "/var/lib/hadoop-mapreduce"}, {"type": "directory", "name": "/var/lib/knox"}, {"type": "directory", "name": "/var/lib/slider"}, {"type": "directory", "name": "/var/lib/spark"}, {"type": "directory", "name": "/var/lib/ranger"}, {"type": "directory", "name": "/var/lib/ambari-metrics-collector"}, {"type": "directory", "name": "/var/lib/zeppelin"}, {"type": "directory", "name": "/var/tmp/oozie"}, {"type": "directory", "name": "/var/tmp/sqoop"}, {"type": "directory", "name": "/tmp/hive"}, {"type": "directory", "name": "/tmp/ambari-qa"}, {"type": "directory", "name": "/tmp/hadoop-hdfs"}, {"type": "directory", "name": "/tmp/ranger"}, {"type": "directory", "name": "/hadoop/oozie"}, {"type": "directory", "name": "/hadoop/zookeeper"}, {"type": "directory", "name": "/hadoop/hdfs"}, {"type": "directory", "name": "/hadoop/storm"}, {"type": "directory", "name": "/hadoop/falcon"}, {"type": "directory", "name": "/hadoop/yarn"}, {"type": "directory", "name": "/kafka-logs"}], "existingUsers": [{"status": "Available", "name": "oozie", "homeDir": "/home/oozie"}, {"status": "Available", "name": "hive", "homeDir": "/home/hive"}, {"status": "Available", "name": "zeppelin", "homeDir": "/home/zeppelin"}, {"status": "Available", "name": "ambari-qa", "homeDir": "/home/ambari-qa"}, {"status": "Available", "name": "flume", "homeDir": "/home/flume"}, {"status": "Available", "name": "hdfs", "homeDir": "/home/hdfs"}, {"status": 
"Available", "name": "knox", "homeDir": "/home/knox"}, {"status": "Available", "name": "ranger", "homeDir": "/home/ranger"}, {"status": "Available", "name": "storm", "homeDir": "/home/storm"}, {"status": "Available", "name": "spark", "homeDir": "/home/spark"}, {"status": "Available", "name": "mapred", "homeDir": "/home/mapred"}, {"status": "Available", "name": "hbase", "homeDir": "/home/hbase"}, {"status": "Available", "name": "tez", "homeDir": "/home/tez"}, {"status": "Available", "name": "zookeeper", "homeDir": "/home/zookeeper"}, {"status": "Available", "name": "kafka", "homeDir": "/home/kafka"}, {"status": "Available", "name": "falcon", "homeDir": "/home/falcon"}, {"status": "Available", "name": "sqoop", "homeDir": "/home/sqoop"}, {"status": "Available", "name": "yarn", "homeDir": "/home/yarn"}, {"status": "Available", "name": "hcat", "homeDir": "/home/hcat"}, {"status": "Available", "name": "ams", "homeDir": "/home/ams"}, {"status": "Available", "name": "atlas", "homeDir": "/home/atlas"}, {"status": "Available", "name": "hue", "homeDir": "/usr/lib/hue"}, {"status": "Available", "name": "kms", "homeDir": "/var/lib/ranger/kms"}], "firewallRunning": true}, "timestamp": 1477949492486, "hostname": "localhost", "responseId": -1, "publicHostname": "localhost"}') INFO 2016-10-31 21:31:32,560 NetUtil.py:62 - Connecting to https://sandbox.hortonworks.com:8440/connection_info INFO 2016-10-31 21:31:32,619 security.py:100 - SSL Connect being called.. connecting to the server INFO 2016-10-31 21:31:32,685 security.py:61 - SSL connection established. Two-way SSL authentication is turned off on the server. ERROR 2016-10-31 21:31:32,694 Controller.py:180 - Cannot register host with non compatible agent version, hostname=localhost, agentVersion=, serverVersion=2.4.0.0 INFO 2016-10-31 21:31:32,694 Controller.py:463 - Registration response from sandbox.hortonworks.com was FAILED INFO 2016-10-31 21:31:32,694 Controller.py:478 - Registration response from %s didn't contain 'response' as a key INFO 2016-10-31 21:31:32,694 Controller.py:450 - Finished heartbeating and registering cycle INFO 2016-10-31 21:31:32,694 Controller.py:456 - Controller thread has successfully finished INFO 2016-10-31 21:31:32,738 ExitHelper.py:53 - Performing cleanup before exiting... INFO 2016-10-31 21:31:32,738 ExitHelper.py:67 - Cleanup finished, exiting with code:0