Support Questions

Hue service is going down frequently


The Hue service goes down shortly after a successful start. Here are the Ambari agent stderr and stdout logs from the start operation:

stderr: /var/lib/ambari-agent/data/errors-8684.txt:

/usr/lib/python2.6/site-packages/resource_management/core/environment.py:165: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6 

Logger.info("Skipping failure of {0} due to ignore_failures. Failure reason: {1}".format(resource, ex.message))

stdout: /var/lib/ambari-agent/data/output-8684.txt:

2017-11-09 08:14:23,520 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-11-09 08:14:23,625 - Stack Feature Version Info: stack_version=2.5, version=2.5.5.0-157, current_cluster_version=2.5.5.0-157 -> 2.5.5.0-157
2017-11-09 08:14:23,626 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-11-09 08:14:23,627 - Group['livy'] {}
2017-11-09 08:14:23,628 - Group['spark'] {}
2017-11-09 08:14:23,628 - Group['hue'] {}
2017-11-09 08:14:23,628 - Group['hadoop'] {}
2017-11-09 08:14:23,628 - Group['users'] {}
2017-11-09 08:14:23,628 - Group['knox'] {}
2017-11-09 08:14:23,629 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,629 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,630 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,630 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,631 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-11-09 08:14:23,631 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,632 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,632 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-11-09 08:14:23,633 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,633 - User['hue'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,634 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,634 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,635 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,635 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,636 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,636 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-11-09 08:14:23,637 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-11-09 08:14:23,638 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-11-09 08:14:23,641 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-11-09 08:14:23,642 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-11-09 08:14:23,643 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-11-09 08:14:23,644 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-11-09 08:14:23,647 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-11-09 08:14:23,649 - Group['hdfs'] {}
2017-11-09 08:14:23,650 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-11-09 08:14:23,650 - FS Type: 
2017-11-09 08:14:23,650 - Directory['/etc/hadoop'] {'mode': 0755}
2017-11-09 08:14:23,662 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-11-09 08:14:23,663 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-11-09 08:14:23,674 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2017-11-09 08:14:23,685 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2017-11-09 08:14:23,686 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2017-11-09 08:14:23,687 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2017-11-09 08:14:23,690 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2017-11-09 08:14:23,692 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2017-11-09 08:14:23,697 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2017-11-09 08:14:23,706 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2017-11-09 08:14:23,707 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2017-11-09 08:14:23,708 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2017-11-09 08:14:23,711 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2017-11-09 08:14:23,714 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2017-11-09 08:14:23,888 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-11-09 08:14:23,891 - Execute['ps -ef | grep hue | grep -v grep | awk  '{print $2}' | xargs kill -9'] {'ignore_failures': True, 'user': 'hue'}
2017-11-09 08:14:23,938 - Skipping failure of Execute['ps -ef | grep hue | grep -v grep | awk  '{print $2}' | xargs kill -9'] due to ignore_failures. Failure reason: Execution of 'ps -ef | grep hue | grep -v grep | awk  '{print $2}' | xargs kill -9' returned 137. kill 100040: Operation not permitted
kill 100063: No such process
kill 100066: No such process
-bash: line 1: 100063 Done                    ps -ef
     100064                       | grep hue
     100065                       | grep -v grep
     100066                       | awk '{print $2}'
     100067 Killed                  | xargs kill -9
2017-11-09 08:14:23,939 - File['/var/run/hue/hue-server.pid'] {'owner': 'hue', 'action': ['delete']}
2017-11-09 08:14:23,939 - Deleting File['/var/run/hue/hue-server.pid']
2017-11-09 08:14:23,941 - Configure Hue Service
2017-11-09 08:14:23,941 - Directory['/var/log/hue'] {'owner': 'hue', 'create_parents': True, 'group': 'hue', 'mode': 0755, 'cd_access': 'a'}
2017-11-09 08:14:23,942 - Directory['/var/run/hue'] {'owner': 'hue', 'group': 'hue', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2017-11-09 08:14:23,942 - File['/var/log/hue/hue-install.log'] {'content': '', 'owner': 'hue', 'group': 'hue', 'mode': 0644}
2017-11-09 08:14:23,943 - Writing File['/var/log/hue/hue-install.log'] because contents don't match
2017-11-09 08:14:23,943 - File['/var/run/hue/hue-server.pid'] {'content': '', 'owner': 'hue', 'group': 'hue', 'mode': 0644}
2017-11-09 08:14:23,943 - Writing File['/var/run/hue/hue-server.pid'] because it doesn't exist
2017-11-09 08:14:23,943 - Changing owner for /var/run/hue/hue-server.pid from 0 to hue
2017-11-09 08:14:23,943 - Changing group for /var/run/hue/hue-server.pid from 0 to hue
2017-11-09 08:14:23,944 - Creating symlinks /usr/hdp/current/hadoop-client/lib/hue-plugins-3.11.0-SNAPSHOT.jar
2017-11-09 08:14:23,944 - Link['/usr/local/hue/desktop/libs/hadoop/java-lib/*'] {'to': '/usr/hdp/current/hadoop-client/lib'}
2017-11-09 08:14:23,944 - Link['/usr/local/hue/desktop/libs/hadoop/java-lib/*'] replacing old symlink to /usr/hdp/2.5.5.0-157/hadoop/lib
2017-11-09 08:14:23,944 - Creating symbolic Link['/usr/local/hue/desktop/libs/hadoop/java-lib/*'] to /usr/hdp/current/hadoop-client/lib
2017-11-09 08:14:23,944 - Execute['find /var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package -iname "*.sh" | xargs chmod +x'] {}
2017-11-09 08:14:23,948 - HdfsResource['/user/hue'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'dfs_type': '', 'default_fs': 'hdfs://ip-192-168-0-42.eu-west-1.compute.internal:8020', 'user': 'hdfs', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': [EMPTY], 'recursive_chmod': True, 'owner': 'hue', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0755}
2017-11-09 08:14:23,951 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://ip-192-168-0-42.eu-west-1.compute.internal:50070/webhdfs/v1/user/hue?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpcu6J43 2>/tmp/tmpXsH5Ic''] {'logoutput': None, 'quiet': False}
2017-11-09 08:14:23,984 - call returned (0, '')
2017-11-09 08:14:23,985 - Creating /usr/local/hue/desktop/conf/log.conf file
2017-11-09 08:14:23,989 - File['/usr/local/hue/desktop/conf/log.conf'] {'owner': 'hue', 'content': InlineTemplate(...)}
2017-11-09 08:14:23,989 - Creating /usr/local/hue/desktop/conf/pseudo-distributed.ini config file
2017-11-09 08:14:24,070 - File['/usr/local/hue/desktop/conf/pseudo-distributed.ini'] {'owner': 'hue', 'content': InlineTemplate(...)}
2017-11-09 08:14:24,071 - Run the script file to add configurations
2017-11-09 08:14:24,072 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc hdfs-site 'dfs.namenode.acls.enabled' 'true''] {}
2017-11-09 08:14:24,485 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.hue.hosts' '*''] {}
2017-11-09 08:14:24,746 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.hbase.hosts' '*''] {}
2017-11-09 08:14:25,013 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.hive.groups' '*''] {}
2017-11-09 08:14:25,274 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.hbase.groups' '*''] {}
2017-11-09 08:14:25,536 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.hue.groups' '*''] {}
2017-11-09 08:14:25,806 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.spark.groups' '*''] {}
2017-11-09 08:14:26,066 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.hive.hosts' '*''] {}
2017-11-09 08:14:26,335 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc core-site 'hadoop.proxyuser.spark.hosts' '*''] {}
2017-11-09 08:14:26,596 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc webhcat-site 'webhcat.proxyuser.hue.groups' '*''] {}
2017-11-09 08:14:26,852 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc webhcat-site 'webhcat.proxyuser.hue.hosts' '*''] {}
2017-11-09 08:14:27,089 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc hive-site 'hive.security.authorization.sqlstd.confwhitelist.append' 'hive.server2.logging.operation.verbose''] {}
2017-11-09 08:14:27,909 - Execute['/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/files/configs.sh set ip-192-168-0-36.eu-west-1.compute.internal technipFMC_poc livy-conf 'livy.server.csrf_protection.enabled' 'false''] {}
2017-11-09 08:14:28,053 - Execute['/usr/local/hue/build/env/bin/supervisor >> /var/log/hue/hue-install.log 2>&1 &'] {'environment': {'SPARK_HOME': '/usr/hdp/current/spark-client', 'HADOOP_CONF_DIR': '/usr/hdp/current/hadoop-client/conf', 'JAVA_HOME': '/usr/jdk64/jdk1.8.0_112', 'PATH': '$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin'}, 'user': 'hue'}
2017-11-09 08:14:28,084 - Execute['ps -ef | grep hue | grep supervisor | grep -v grep | awk '{print $2}' > /var/run/hue/hue-server.pid'] {'user': 'hue'}
 

Command completed successfully!
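Two details in the log above look suspicious and can be reproduced in isolation. First, the stop step greps for the bare pattern `hue`, so the `kill -9` also hits the pipeline's own grep/awk processes (hence the `Killed` status and exit code 137) and any other user's process whose command line merely contains `hue` (hence `kill 100040: Operation not permitted`). Second, the pid-capture step at 08:14:28 is a pipeline ending in `awk`, so it exits 0 even when no supervisor process is found, writing an empty pid file while Ambari still reports "Command completed successfully!". A minimal sketch of both pitfalls, using simulated `ps` output rather than a live system:

```shell
#!/bin/sh
# 1) The bare pattern "hue" over-matches: here it catches root's
#    unrelated process too (which user hue may not kill, giving
#    "Operation not permitted"), and on a live system it also matches
#    the pipeline's own grep/awk processes (giving exit code 137).
printf 'hue 100 1 supervisor\nroot 200 1 hue-indexer\n' \
  | grep hue | awk '{print $2}'        # prints 100 AND 200

# 2) An empty match still "succeeds": the pipeline's exit status is
#    awk's (0), so an empty pid file is written and the step passes.
pidfile=$(mktemp)                      # stand-in for /var/run/hue/hue-server.pid
printf '' | grep supervisor | awk '{print $2}' > "$pidfile"
echo "pipeline exit: $?"               # prints: pipeline exit: 0
test -s "$pidfile" || echo "pid file is empty"
rm -f "$pidfile"

# A more targeted restart helper (hypothetical, not what the HUE
# package ships) could scope the match instead, e.g.:
#   pkill -u hue -f '/usr/local/hue/build/env/bin/supervisor'
#   pgrep -u hue -f supervisor > /var/run/hue/hue-server.pid
```

If the supervisor is crashing right after launch, /var/log/hue/hue-install.log (where the start command redirects its output) would be the first place to look.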