Hue Job Browser is not working

Explorer

We upgraded our cluster from CDH 5.4 to 5.8.2, and since the upgrade we have a problem with the Job Browser in Hue.

The problem is that I can see the running jobs in the YARN ResourceManager UI, but for some reason I can't see them in Hue.

 

I have googled a lot, but couldn't find anything related to it.

 

The only thing I found was a suggestion to format the YARN ResourceManager state store.

 

So I executed the following:

 

#1 Stop the ResourceManager (YARN)
#2 yarn resourcemanager -format-state-store
#3 Start the ResourceManager (YARN)

After this, the problem was solved.

 

But after some time (around 3-4 days) the issue appeared again.

 

I could reproduce this problem in two different environments.
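
For reference, here is roughly how the ResourceManager reports running applications over its REST API. This is only a sketch: the hostname and port are placeholders for your active RM, and it assumes the Python requests library. The point is that the RM lists running applications that do not show up in Hue:

# Sketch: list RUNNING applications straight from the ResourceManager REST API,
# to compare against what Hue's Job Browser shows.
# Placeholder RM host/port; assumes the Python requests library is installed.
import requests

RM_URL = 'http://resourcemanager.example.com:8088'

resp = requests.get(RM_URL + '/ws/v1/cluster/apps',
                    params={'states': 'RUNNING'},
                    headers={'Accept': 'application/json'})
resp.raise_for_status()

# The RM returns {"apps": null} when there are no applications.
apps = (resp.json().get('apps') or {}).get('app') or []
print('RM reports %d running application(s)' % len(apps))
for app in apps:
    # These are the applications that should also appear in Hue's Job Browser.
    print('%s  %s  %s  %s' % (app['id'], app['user'], app['state'], app['name']))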

 

Do you have any ideas?

 

Thanks in advance!

 

11 REPLIES

Super Guru
What do you see in the /logs page of Hue when the problem occurs?
Do you have only one RM?

Contributor
I have exactly the same problem.
We upgraded CDH from 5.8.0 to 5.8.2.

All jobs that were running under 5.8.0 are available in the Job Browser, but right after the upgrade new YARN jobs do not appear in Hue.

Explorer

Thanks for your reply.

Please check the log file: /var/log/hue/error.log (we have two Hue servers running).

I also have two RMs (active and standby).

 

[24/Oct/2016 15:23:51 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477318534753_0023 (error 404)
[24/Oct/2016 15:23:51 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477318534753_0023 (error 404)
[24/Oct/2016 15:23:51 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477318534753_0023 (error 404)
[24/Oct/2016 15:31:00 +0000] solr         ERROR    Search is not enabled
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/solr.py", line 33, in <module>
    from libsolr.api import SolrApi as NativeSolrApi
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 48, in <module>
    class SolrApi(object):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 52, in SolrApi
    def __init__(self, solr_url, user, security_enabled=SECURITY_ENABLED.get(), ssl_cert_ca_verify=SSL_CERT_CA_VERIFY.get()):
AttributeError: 'Config' object has no attribute 'get'
[24/Oct/2016 15:31:48 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0002 (error 404)
[24/Oct/2016 15:31:48 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0002 (error 404)
[24/Oct/2016 15:31:48 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0002 (error 404)
[24/Oct/2016 15:43:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0004 (error 404)
[24/Oct/2016 15:43:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0004 (error 404)
[24/Oct/2016 15:43:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0004 (error 404)
[24/Oct/2016 15:43:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0004 (error 404)
[24/Oct/2016 15:43:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0004 (error 404)
[24/Oct/2016 15:43:13 +0000] decorators   ERROR    Error running <function check_status at 0x7f006414ea28>
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/decorators.py", line 81, in decorator
    return func(*args, **kwargs)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/api.py", line 167, in check_status
    response['query_status'] = get_api(request, snippet).check_status(notebook, snippet)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 67, in decorator
    return func(*args, **kwargs)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 248, in check_status
    raise QueryError(operation.errorMessage)
QueryError: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
[24/Oct/2016 15:43:13 +0000] decorators   ERROR    Error running <function check_status at 0x7f006414ea28>
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/decorators.py", line 81, in decorator
    return func(*args, **kwargs)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/api.py", line 167, in check_status
    response['query_status'] = get_api(request, snippet).check_status(notebook, snippet)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 67, in decorator
    return func(*args, **kwargs)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 248, in check_status
    raise QueryError(operation.errorMessage)
QueryError: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
[24/Oct/2016 16:00:05 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0023 (error 404)
[24/Oct/2016 16:00:05 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0023 (error 404)
[24/Oct/2016 16:00:05 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0023 (error 404)
[24/Oct/2016 16:01:03 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0011 (error 404)
[24/Oct/2016 16:01:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0011 (error 404)
[24/Oct/2016 16:01:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0011 (error 404)
[24/Oct/2016 16:01:04 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0011 (error 404)
[24/Oct/2016 16:09:03 +0000] solr         ERROR    Search is not enabled
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/solr.py", line 33, in <module>
    from libsolr.api import SolrApi as NativeSolrApi
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 48, in <module>
    class SolrApi(object):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 52, in SolrApi
    def __init__(self, solr_url, user, security_enabled=SECURITY_ENABLED.get(), ssl_cert_ca_verify=SSL_CERT_CA_VERIFY.get()):
AttributeError: 'Config' object has no attribute 'get'
[24/Oct/2016 16:10:09 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0028 (error 404)
[24/Oct/2016 16:10:09 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0028 (error 404)
[24/Oct/2016 16:10:09 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477322667025_0028 (error 404)
[24/Oct/2016 16:25:27 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477325247147_0002 (error 404)
[24/Oct/2016 16:25:27 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477325247147_0002 (error 404)
[25/Oct/2016 07:11:15 +0000] cluster      ERROR    RM ha is not available, skipping it: YARN RM returned a failed response: HTTPConnectionPool(host='stage-gap-namenode-2.srv.glispa.com', port=8088): Max retries exceeded with url: /ws/v1/cluster/info (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f16761fe650>: Failed to establish a new connection: [Errno 111] Connection refused',))
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/cluster.py", line 223, in get_next_ha_yarncluster
    cluster_info = rm.cluster()
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/yarn/resource_manager_api.py", line 130, in cluster
    return self._execute(self._root.get, 'cluster/info', params=params, headers={'Accept': _JSON_CONTENT_TYPE})
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/yarn/resource_manager_api.py", line 176, in _execute
    raise PopupException(_('YARN RM returned a failed response: %s') % e)
PopupException: YARN RM returned a failed response: HTTPConnectionPool(host='stage-gap-namenode-2.srv.glispa.com', port=8088): Max retries exceeded with url: /ws/v1/cluster/info (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f16761fe650>: Failed to establish a new connection: [Errno 111] Connection refused',))
[25/Oct/2016 07:11:19 +0000] cluster      ERROR    RM default is not available, skipping it: YARN RM returned a failed response: HTTPConnectionPool(host='stage-gap-namenode-1.srv.glispa.com', port=8088): Max retries exceeded with url: /ws/v1/cluster/info (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f167629da90>: Failed to establish a new connection: [Errno 111] Connection refused',))
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/cluster.py", line 223, in get_next_ha_yarncluster
    cluster_info = rm.cluster()
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/yarn/resource_manager_api.py", line 130, in cluster
    return self._execute(self._root.get, 'cluster/info', params=params, headers={'Accept': _JSON_CONTENT_TYPE})
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/yarn/resource_manager_api.py", line 176, in _execute
    raise PopupException(_('YARN RM returned a failed response: %s') % e)
PopupException: YARN RM returned a failed response: HTTPConnectionPool(host='stage-gap-namenode-1.srv.glispa.com', port=8088): Max retries exceeded with url: /ws/v1/cluster/info (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f167629da90>: Failed to establish a new connection: [Errno 111] Connection refused',))
[25/Oct/2016 07:11:19 +0000] cluster      ERROR    RM ha is not available, skipping it: YARN RM returned a failed response: HTTPConnectionPool(host='stage-gap-namenode-2.srv.glispa.com', port=8088): Max retries exceeded with url: /ws/v1/cluster/info (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f16762a4c10>: Failed to establish a new connection: [Errno 111] Connection refused',))
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/cluster.py", line 223, in get_next_ha_yarncluster
    cluster_info = rm.cluster()
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/yarn/resource_manager_api.py", line 130, in cluster
    return self._execute(self._root.get, 'cluster/info', params=params, headers={'Accept': _JSON_CONTENT_TYPE})
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/hadoop/src/hadoop/yarn/resource_manager_api.py", line 176, in _execute
    raise PopupException(_('YARN RM returned a failed response: %s') % e)
PopupException: YARN RM returned a failed response: HTTPConnectionPool(host='stage-gap-namenode-2.srv.glispa.com', port=8088): Max retries exceeded with url: /ws/v1/cluster/info (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f16762a4c10>: Failed to establish a new connection: [Errno 111] Connection refused',))
[25/Oct/2016 07:23:13 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:23:13 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:23:13 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:26:14 +0000] solr         ERROR    Search is not enabled
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/solr.py", line 33, in <module>
    from libsolr.api import SolrApi as NativeSolrApi
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 48, in <module>
    class SolrApi(object):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 52, in SolrApi
    def __init__(self, solr_url, user, security_enabled=SECURITY_ENABLED.get(), ssl_cert_ca_verify=SSL_CERT_CA_VERIFY.get()):
AttributeError: 'Config' object has no attribute 'get'
[25/Oct/2016 07:31:57 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:31:57 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:31:57 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:32:01 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:32:01 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 07:32:01 +0000] api          ERROR    failed to load the HBase clusters
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/apps/hbase/src/hbase/api.py", line 64, in getClusters
    full_config = json.loads(conf.HBASE_CLUSTERS.get().replace("'", "\""))
  File "/usr/lib/python2.7/json/__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
[25/Oct/2016 10:07:14 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477388354654_0014 (error 404)
[25/Oct/2016 10:07:14 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477388354654_0014 (error 404)
[25/Oct/2016 10:07:14 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477388354654_0014 (error 404)
[25/Oct/2016 10:07:14 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477388354654_0014 (error 404)
[25/Oct/2016 12:52:29 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477390101339_0034 (error 404)
[25/Oct/2016 12:52:29 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477390101339_0034 (error 404)
[25/Oct/2016 12:52:30 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477390101339_0034 (error 404)
[25/Oct/2016 13:51:34 +0000] yarn_models  ERROR    Failed to get Spark Job executors: no such app: application_1477390101339_0050 (error 404)
[25/Oct/2016 14:19:45 +0000] solr         ERROR    Search is not enabled
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/solr.py", line 33, in <module>
    from libsolr.api import SolrApi as NativeSolrApi
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 48, in <module>
    class SolrApi(object):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/libsolr/src/libsolr/api.py", line 52, in SolrApi
    def __init__(self, solr_url, user, security_enabled=SECURITY_ENABLED.get(), ssl_cert_ca_verify=SSL_CERT_CA_VERIFY.get()):
AttributeError: 'Config' object has no attribute 'get'
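
The "Connection refused" entries above suggest that, at that moment, neither ResourceManager was answering on port 8088. A quick way to check which RM is currently active is to hit the same /ws/v1/cluster/info endpoint that Hue polls. This is only a sketch, using the hostnames from the log above and assuming the Python requests library:

# Sketch: probe both ResourceManagers on the endpoint Hue polls (/ws/v1/cluster/info).
# Hostnames taken from the log above; assumes the Python requests library.
import requests

RM_HOSTS = ['stage-gap-namenode-1.srv.glispa.com',
            'stage-gap-namenode-2.srv.glispa.com']

for host in RM_HOSTS:
    url = 'http://%s:8088/ws/v1/cluster/info' % host
    try:
        info = requests.get(url, headers={'Accept': 'application/json'},
                            timeout=5).json()['clusterInfo']
        # haState should be ACTIVE on exactly one RM and STANDBY on the other.
        print('%s: state=%s haState=%s' % (host, info.get('state'), info.get('haState')))
    except requests.exceptions.RequestException as e:
        # This matches the "Connection refused" situation Hue logged.
        print('%s: unreachable (%s)' % (host, e))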

 


Super Guru
Could you share your Hue logs?

Explorer

Hi, 

 

I have already attached the logs. Do you need anything else?

 

Thanks,

Konstantin

Explorer

Hi,

I think we might have the same issue. Any update on this?

We are running Cloudera 5.8.2 and are not able to see running jobs in the Job Browser.

 

I checked the <blabla>:8888/logs page; nothing interesting.

 

I checked /var/log/hue and found some entries

 

[14/Nov/2016 17:16:43 +0100] decorators   ERROR    Error running <function check_status at 0x7f60c99bf2a8>
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/decorators.py", line 81, in decorator
    return func(*args, **kwargs)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/api.py", line 167, in check_status
    response['query_status'] = get_api(request, snippet).check_status(notebook, snippet)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 67, in decorator
    return func(*args, **kwargs)
  File "/opt/cloudera/parcels/CDH-5.8.2-1.cdh5.8.2.p0.3/lib/hue/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 248, in check_status
    raise QueryError(operation.errorMessage)
QueryError: No error message, please check the logs.

 

that *might* be related.

Regards,
Filippo

 

Explorer

Hello,

 

We also face this issue with Hue 3.11. It shows only 1000 jobs, and the latest jobs are not shown in the Hue Job Browser.
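
A rough way to check whether the 1000-job cap comes from Hue or from the RM itself is to ask the RM REST API for more applications explicitly. This is only a sketch: the host and port are placeholders, the Python requests library is assumed, and limit is a standard query parameter of /ws/v1/cluster/apps:

# Sketch: ask the RM directly for more than 1000 applications and check whether
# the newest jobs are present in its response at all.
# Placeholder RM host/port; assumes the Python requests library.
import requests

RM_URL = 'http://resourcemanager.example.com:8088'

resp = requests.get(RM_URL + '/ws/v1/cluster/apps',
                    params={'limit': 2000},  # deliberately above the 1000 shown in Hue
                    headers={'Accept': 'application/json'})
resp.raise_for_status()

apps = (resp.json().get('apps') or {}).get('app') or []
print('RM returned %d application(s)' % len(apps))
if apps:
    newest = max(apps, key=lambda a: a.get('startedTime', 0))
    print('Most recent application: %s (%s)' % (newest['id'], newest['state']))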

 

Any update?

 

 

Regards

Thybu

New Contributor

Hi Guys,

 

We are also having the same issue after upgrading from CDH 5.8 to CDH 5.10.

 

[12/Apr/2017 08:17:52 +1200] decorators   ERROR    Error running <function execute at 0x7fd334c5dcf8>
Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/lib/hue/desktop/libs/notebook/src/notebook/decorators.py", line 82, in decorator
    return func(*args, **kwargs)
  File "/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/lib/hue/desktop/libs/notebook/src/notebook/api.py", line 163, in execute
    response = _execute_notebook(request, notebook, snippet)
  File "/opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.41/lib/hue/desktop/libs/notebook/src/notebook/api.py", line 145, in _execute_notebook
    raise ex

 

Any update?