
Data Analytics UI not working

New Contributor

Hello, I was following the Getting Started with HDP tutorial and was trying to run the Data Analytics Studio UI. However, when I start the service and click on Data Analytics Studio UI, I get a 502 Bad Gateway error.

I have tried the usual fixes, such as changing the URL to localhost:30800/# and making sure the hosts file is correct. However, I have not been able to fix this issue.

I have also gone through previous posts about the same issue, but I could not find an answer that worked. Any help would be greatly appreciated.
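For anyone else debugging this, the checks I ran boil down to something like the following (the hostname and port are the sandbox defaults from the tutorial; on Windows, ping takes -n instead of -c):

# Confirm the hosts file entry resolves to the sandbox:
ping -c 1 sandbox-hdp.hortonworks.com

# Probe the DAS UI port directly; a 502 here means the proxy answered
# but the DAS webapp behind it did not:
curl -v http://sandbox-hdp.hortonworks.com:30800/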

6 REPLIES

Mentor

@Srishan 

I just posted a response to a similar question. Did you reset the root password and run ambari-admin-password-reset as well? The sandbox should work out of the box without much tweaking.
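For reference, a rough sketch of that reset from a shell on the sandbox (port 2222 is the sandbox's default forwarded SSH port; adjust if yours differs):

# SSH into the sandbox (default forwarded port assumed):
ssh root@sandbox-hdp.hortonworks.com -p 2222

# Inside the sandbox, reset the Ambari admin password:
ambari-admin-password-reset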

New Contributor

Hi Shelton,

 

Yes, I did. I followed the Learning the Ropes of HDP tutorial exactly. Also, the Data Analytics Studio service stops after about a minute or two; I'm not sure what is causing that either.
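To show what I mean, this is roughly how I watched it stop (the pid file path is the one Ambari uses on the sandbox, so it is an assumption about this particular install):

# Read the PID Ambari recorded for the DAS webapp (path assumed):
DAS_PID=$(cat /usr/das/1.0.2.0-6/data_analytics_studio/das-webapp.pid)

# Poll until the process disappears:
while ps -p "$DAS_PID" > /dev/null; do sleep 5; done
echo "das-webapp (pid $DAS_PID) has exited"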

Mentor

@Srishan 

 

I want you to really confirm you followed the Cloudera documentation. In my experience, I have at times struggled to help members with issues: starting from the basics, I ask whether, for example, they followed the environment preparation prerequisites, they confirm they did, and in the end, after exhausting all avenues passionately trying to help, it turns out it was the firewall or NTP, after three or four days of exchanging emails.
Having said that, can you share the logs so I can analyze them?
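(For anyone unsure where to look: on the sandbox, DAS writes its logs under /var/log/das; the exact file names below are an assumption and may vary by version.)

# List the DAS log directory and tail whatever is there:
ls -l /var/log/das
tail -n 200 /var/log/das/*.log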

New Contributor
stderr:
None
stdout:
2019-10-16 20:35:11,899 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2019-10-16 20:35:11,976 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-10-16 20:35:12,622 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=3.0.1.0-187 -> 3.0.1.0-187
2019-10-16 20:35:12,643 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-10-16 20:35:12,649 - Group['livy'] {}
2019-10-16 20:35:12,652 - Group['spark'] {}
2019-10-16 20:35:12,653 - Group['ranger'] {}
2019-10-16 20:35:12,653 - Group['hdfs'] {}
2019-10-16 20:35:12,653 - Group['zeppelin'] {}
2019-10-16 20:35:12,654 - Group['hadoop'] {}
2019-10-16 20:35:12,654 - Group['users'] {}
2019-10-16 20:35:12,654 - Group['knox'] {}
2019-10-16 20:35:12,657 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,660 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,664 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,668 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,671 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,674 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,677 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-10-16 20:35:12,680 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,683 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,687 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2019-10-16 20:35:12,690 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-10-16 20:35:12,694 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2019-10-16 20:35:12,698 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2019-10-16 20:35:12,701 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,704 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2019-10-16 20:35:12,707 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-10-16 20:35:12,711 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,714 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2019-10-16 20:35:12,717 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,720 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,724 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,729 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-10-16 20:35:12,732 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2019-10-16 20:35:12,734 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-10-16 20:35:12,739 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-10-16 20:35:12,747 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-10-16 20:35:12,748 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-10-16 20:35:12,751 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-10-16 20:35:12,755 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-10-16 20:35:12,759 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2019-10-16 20:35:12,770 - call returned (0, '1015')
2019-10-16 20:35:12,771 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-10-16 20:35:12,778 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] due to not_if
2019-10-16 20:35:12,779 - Group['hdfs'] {}
2019-10-16 20:35:12,780 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2019-10-16 20:35:12,781 - FS Type: HDFS
2019-10-16 20:35:12,781 - Directory['/etc/hadoop'] {'mode': 0755}
2019-10-16 20:35:12,835 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-10-16 20:35:12,836 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-10-16 20:35:12,873 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2019-10-16 20:35:12,879 - Skipping Execute[('setenforce', '0')] due to not_if
2019-10-16 20:35:12,880 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2019-10-16 20:35:12,888 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2019-10-16 20:35:12,889 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'cd_access': 'a'}
2019-10-16 20:35:12,890 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2019-10-16 20:35:12,903 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2019-10-16 20:35:12,908 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2019-10-16 20:35:12,929 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2019-10-16 20:35:12,966 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-10-16 20:35:12,967 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2019-10-16 20:35:12,970 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2019-10-16 20:35:12,984 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2019-10-16 20:35:12,990 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2019-10-16 20:35:12,997 - Skipping unlimited key JCE policy check and setup since the Java VM is not managed by Ambari
2019-10-16 20:35:13,026 - Skipping stack-select on DATA_ANALYTICS_STUDIO because it does not exist in the stack-select package structure.
2019-10-16 20:35:13,719 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-10-16 20:35:13,724 - Directory['/etc/das'] {'owner': 'hive', 'create_parents': True, 'mode': 0755}
2019-10-16 20:35:13,728 - Directory['/etc/das/conf'] {'owner': 'hive', 'create_parents': True, 'mode': 0755}
2019-10-16 20:35:13,729 - Directory['/usr/das/1.0.2.0-6/data_analytics_studio'] {'owner': 'hive', 'create_parents': True, 'mode': 0755}
2019-10-16 20:35:13,729 - Directory['/var/log/das'] {'owner': 'hive', 'create_parents': True, 'mode': 0755}
2019-10-16 20:35:13,739 - File['/etc/das/conf/das.conf'] {'owner': 'hive', 'content': InlineTemplate(...), 'mode': 0400}
2019-10-16 20:35:13,741 - PropertiesFile['/etc/das/conf/das-hive-site.conf'] {'owner': 'hive', 'mode': 0400, 'properties': {'hive.server2.enable.doAs': True, 'hive.server2.zookeeper.namespace': u'hiveserver2', 'hive.server2.support.dynamic.service.discovery': 'true', 'hive.zookeeper.quorum': u'sandbox-hdp.hortonworks.com:2181'}}
2019-10-16 20:35:13,754 - Generating properties file: /etc/das/conf/das-hive-site.conf
2019-10-16 20:35:13,754 - File['/etc/das/conf/das-hive-site.conf'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': None, 'mode': 0400, 'encoding': 'UTF-8'}
2019-10-16 20:35:13,761 - Writing File['/etc/das/conf/das-hive-site.conf'] because contents don't match
2019-10-16 20:35:13,762 - PropertiesFile['/etc/das/conf/das-hive-interactive-site.conf'] {'owner': 'hive', 'mode': 0400, 'properties': {'hive.server2.support.dynamic.service.discovery': 'true', 'hive.server2.zookeeper.namespace': u'hiveserver2-hive2', 'hive.zookeeper.quorum': u'sandbox-hdp.hortonworks.com:2181'}}
2019-10-16 20:35:13,773 - Generating properties file: /etc/das/conf/das-hive-interactive-site.conf
2019-10-16 20:35:13,774 - File['/etc/das/conf/das-hive-interactive-site.conf'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': None, 'mode': 0400, 'encoding': 'UTF-8'}
2019-10-16 20:35:13,780 - Writing File['/etc/das/conf/das-hive-interactive-site.conf'] because contents don't match
2019-10-16 20:35:13,810 - File['/etc/das/conf/das-webapp.json'] {'owner': 'hive', 'content': InlineTemplate(...), 'mode': 0400}
2019-10-16 20:35:13,815 - File['/etc/das/conf/das-webapp-env.sh'] {'owner': 'hive', 'content': InlineTemplate(...), 'mode': 0400}
2019-10-16 20:35:13,818 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'cat /usr/das/1.0.2.0-6/data_analytics_studio/das-webapp.pid 1>/tmp/tmpXvXEUV 2>/tmp/tmpCsgdk9''] {'quiet': False}
2019-10-16 20:35:13,918 - call returned (0, '')
2019-10-16 20:35:13,918 - get_user_call_output returned (0, u'1141', u'')
2019-10-16 20:35:13,920 - Execute['source /etc/das/conf/das-webapp-env.sh; /usr/das/current/data_analytics_studio/bin/das-webapp start'] {'environment': {'HADOOP_CONF': u'/usr/hdp/3.0.1.0-187/hadoop/conf', 'JAVA_HOME': u'/usr/lib/jvm/java'}, 'not_if': 'ls /usr/das/1.0.2.0-6/data_analytics_studio/das-webapp.pid > /dev/null 2>&1 && ps -p 1141 >/dev/null 2>&1', 'user': 'hive'}
2019-10-16 20:35:20,324 - Component has started with pid(s): 23436
2019-10-16 20:35:20,358 - Skipping stack-select on DATA_ANALYTICS_STUDIO because it does not exist in the stack-select package structure.

Command completed successfully!

Hi Shelton,

Above is the log generated when starting Data Analytics Studio.

I have also tried opening the Data Analytics Studio UI after disabling the firewall on my host machine (Windows 10), yet I get the same error. Please let me know if I'm doing something wrong.
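In case it helps, I can also verify inside the sandbox whether the webapp Ambari started is still alive and listening; a quick sketch using the pid reported in the log above (30800 is the DAS UI port from the tutorial):

# The start above reported pid 23436; check whether it survived:
ps -p 23436

# Check whether anything is listening on the DAS UI port
# (use ss -tlnp if netstat is not installed):
netstat -tlnp | grep 30800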

New Contributor

Hi @Srishan, @Shelton. I'm experiencing exactly the same issue.

 

Changes I have made in my container since I started using the Sandbox HDP 3:

I installed Cassandra and MongoDB. I had a prior container running the same Sandbox (I'm running Docker for Mac) without these NoSQL databases, and DAS was working perfectly.

Now I also have Apache Drill in the same container, but DAS was already struggling to run before I added Drill.

 

Thanks in advance!

 

Best regards.

New Contributor

UPDATE:

 

@Srishan @Shelton, I just selected Config V1 and reapplied it as the current config (and restarted all affected components), and the DAS web UI is working again.
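For anyone who cannot get the restart to stick from the UI, the start command Ambari runs is visible in the log earlier in this thread, so the webapp can also be started by hand; a sketch, run as the hive user, using the sandbox's default paths:

# Mirrors the start command Ambari executes (taken from the log above):
source /etc/das/conf/das-webapp-env.sh
/usr/das/current/data_analytics_studio/bin/das-webapp start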

 

 
