
Cloudbreak 2.4 Cluster creation failed

New Contributor

Hello, I'm trying to create an HDP 2.6.1.3 cluster using the latest Cloudbreak 2.4 GA on AWS.

Repository details

Ambari Version: 2.6.1.0

Stack Repository Version: 2.6.4.0-91



It's failing on Hive Metastore Start with the following error:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 203, in <module>
    HiveMetastore().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 54, in start
    self.configure(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 120, in locking_configure
    original_configure(obj, *args, **kw)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 72, in configure
    hive(name = 'metastore')
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 310, in hive
    jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 527, in jdbc_connector
    content = DownloadSource(params.driver_curl_source))
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 52, in __call__
    return self.get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 197, in get_content
    raise Fail("Failed to download file from {0} due to HTTP error: {1}".format(self.url, str(ex)))
resource_management.core.exceptions.Fail: Failed to download file from http://ip-10-4-1-126.eu-west-1.compute.internal:8080/resources//mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found

I have tried this with a number of blueprints, e.g. Data Science: Apache Spark 1.6, Apache Zeppelin 0.7.0, and they all have the same issue.

Has anyone else seen this issue, or have a suggestion on how to work around this?

Thanks in advance

TL

1 ACCEPTED SOLUTION


Hi @Tok Luo,

If you would like to use Ambari 2.6 with Cloudbreak, you should use 2.6.1.3 at this point. See https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.4.0/content/releasenotes/index.htm... > "Support for Ambari 2.6 (2.6.1.3+)"

This link also explains what you need to consider when using 2.6.1.3:

  • Ambari 2.6.1 or newer does not install the MySQL connector; therefore, when creating a blueprint for Ambari 2.6.1 or newer, you should not include the MYSQL_SERVER component for Hive Metastore in your blueprint. Instead, you have two options:
    • Configure an external RDBMS instance for Hive Metastore and include the JDBC connection information in your blueprint. If you choose to use an external database that is not PostgreSQL (such as Oracle or MySQL), you must also set up Ambari with the appropriate connector; to do this, create a pre-ambari-start recipe and pass it when creating a cluster.
    • If a remote Hive RDBMS is not provided, Cloudbreak installs a Postgres instance and configures it for Hive Metastore during the cluster launch.

    For information on how to configure an external database and pass your external database connection parameters, refer to Ambari blueprint documentation.

  • If you would like to use Oozie, you must manually install Ext JS. The steps are described in Cannot Access Oozie Web UI.
  • To enable LZO compression in your HDP cluster, you must check the "Enable Ambari Server to download and install GPL Licensed LZO packages?" during cluster creation. The option is available under Security > Prewarmed and Base Images.
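
For the external-RDBMS option above, the JDBC connection information goes into the blueprint's `configurations` section. Here is a minimal sketch of such a fragment; every host name, user name, password, and database name below is a hypothetical placeholder, and the exact `hive_database` display string can vary by Ambari version, so check it against your Ambari docs:

```python
import json

# Sketch of the Hive Metastore JDBC settings to embed in an Ambari blueprint's
# "configurations" list when pointing Hive at an external MySQL instance.
# All values below are hypothetical placeholders.
configurations = [
    {"hive-site": {
        "javax.jdo.option.ConnectionURL": "jdbc:mysql://db.example.com:3306/hive",
        "javax.jdo.option.ConnectionDriverName": "com.mysql.jdbc.Driver",
        "javax.jdo.option.ConnectionUserName": "hive",
        "javax.jdo.option.ConnectionPassword": "change-me",
    }},
    {"hive-env": {
        # Display string may differ between Ambari versions; verify before use.
        "hive_database": "Existing MySQL / MariaDB Database",
        "hive_database_type": "mysql",
    }},
]

print(json.dumps(configurations, indent=2))
```

Remember that with this setup the MySQL JDBC driver still has to reach Ambari somehow (e.g. the pre-ambari-start recipe mentioned above), otherwise the same 404 on mysql-connector-java.jar will occur.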


3 REPLIES

Expert Contributor

Hi @Tok Luo,

Could you please check that there is no MYSQL_SERVER component in the blueprint? If you want to use MySQL, you have to install the connector manually (e.g. with a recipe), as its support has been removed from Ambari:

https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.0/bk_ambari-release-notes/content/ambari_reln...

If no MYSQL_SERVER is included then Cloudbreak should install and use PostgreSQL for Hive.
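
One quick way to verify this before submitting the blueprint is to walk its host groups and check whether MYSQL_SERVER appears anywhere. A small sketch (the inline blueprint here is a made-up minimal example, not one from this thread):

```python
import json

# Made-up minimal blueprint fragment, for illustration only.
blueprint = json.loads("""
{
  "host_groups": [
    {"name": "master",
     "components": [{"name": "HIVE_METASTORE"}, {"name": "MYSQL_SERVER"}]}
  ]
}
""")

# Collect every component name across all host groups.
components = [c["name"]
              for hg in blueprint["host_groups"]
              for c in hg["components"]]

if "MYSQL_SERVER" in components:
    print("MYSQL_SERVER found: remove it, or install the MySQL connector via a recipe.")
else:
    print("No MYSQL_SERVER: Cloudbreak should set up PostgreSQL for Hive.")
```

The same walk works on a real exported blueprint; just load the JSON from file instead of the inline string.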


New Contributor
@mmolnar @Dominika Bialek

Thank you for your response. It's now working.

Regards

Tok