Member since: 01-07-2019
Posts: 217
Kudos Received: 135
Solutions: 18
My Accepted Solutions
Title | Views | Posted
---|---|---
| 1930 | 12-09-2021 09:57 PM
| 1864 | 10-15-2018 06:19 PM
| 9240 | 10-10-2018 07:03 PM
| 4025 | 07-24-2018 06:14 PM
| 1478 | 07-06-2018 06:19 PM
08-24-2018
08:48 PM
1 Kudo
In the following video, I demonstrate how to create an interactive Cloudbreak credential on Azure.
Video link: https://youtu.be/Q9fqmOZqnnw
The Cloudbreak version used in this video is Cloudbreak 2.7.1.
If you are using a corporate Azure account, you may be unable to use this method because it requires the Owner role. In that case, use the app-based credential method instead; see the Create an app-based credential documentation. If you are using a Cloudbreak version other than 2.7.1, refer to the equivalent documentation for that version. You can access the Cloudbreak documentation from the Hortonworks docs page at https://docs.hortonworks.com.
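Whichever method you use, the credential form expects a handful of Azure identifiers that are easy to mistype. As a minimal sketch (the function name and field set are my own, not part of Cloudbreak; app-based credentials generally need the subscription ID, tenant ID, app ID, and the app's secret), you can sanity-check the values before pasting them in:

```python
import uuid

def validate_azure_app_credential(subscription_id, tenant_id, app_id, password):
    """Sanity-check the values before entering them into the credential form.

    Azure subscription, tenant, and application IDs are GUIDs; the password
    (client secret) just needs to be non-empty.
    """
    for name, value in [("subscription_id", subscription_id),
                        ("tenant_id", tenant_id),
                        ("app_id", app_id)]:
        try:
            uuid.UUID(value)  # raises ValueError if not a valid GUID
        except ValueError:
            raise ValueError(f"{name} is not a valid GUID: {value!r}")
    if not password:
        raise ValueError("password (client secret) must not be empty")
    return True
```

This catches the common copy-paste errors (truncated GUIDs, stray whitespace) before Cloudbreak rejects the credential.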
08-24-2018
08:34 PM
2 Kudos
In the following video, I demonstrate how to create a role-based Cloudbreak credential on AWS.
Video link: https://youtu.be/XlvUahqwThI
The Cloudbreak version used in this video is Cloudbreak 2.7.1.
To obtain the policy that needs to be assigned to the role, refer to the Create CredentialRole documentation. If you are using a corporate AWS account, you may be unable to perform some of the steps (such as role creation and policy assignment) yourself, and you may have to contact your AWS admin to perform these steps for you. If you are using a Cloudbreak version other than 2.7.1, refer to the equivalent documentation for that version. You can access the Cloudbreak documentation from the Hortonworks docs page at https://docs.hortonworks.com.
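For context on what the role-based method sets up: the role you (or your AWS admin) create needs a trust relationship that lets Cloudbreak's AWS account assume it, scoped with an external ID. A minimal sketch of what that trust policy document looks like (the account ID and external ID below are placeholders; use the actual values Cloudbreak shows you during credential creation):

```python
import json

# Placeholder values for illustration only; Cloudbreak displays the real
# account ID and external ID to use when you create a role-based credential.
CLOUDBREAK_ACCOUNT_ID = "123456789012"
EXTERNAL_ID = "my-external-id"

def build_trust_policy(account_id, external_id):
    """Build a standard cross-account IAM trust relationship document."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{account_id}:root"},
            "Action": "sts:AssumeRole",
            "Condition": {"StringEquals": {"sts:ExternalId": external_id}},
        }],
    }

print(json.dumps(build_trust_policy(CLOUDBREAK_ACCOUNT_ID, EXTERNAL_ID), indent=2))
```

The `sts:ExternalId` condition is what prevents another party from asking Cloudbreak to assume your role on their behalf (the "confused deputy" problem).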
08-17-2018
07:37 PM
@sunile.manjee I updated the tutorial to include @pdarvasi's suggestion as a note.
07-26-2018
09:50 PM
3 Kudos
Starting with HDP 2.6.5, it is possible to configure access to Google Cloud Storage (GCS) from HDP via the gs cloud storage connector. There are two ways to do this:
- You can use Cloudbreak to deploy clusters on GCP and to configure access to GCS from your Cloudbreak-managed HDP cluster. The steps involve (1) setting up a service account on Google Cloud and (2) providing the service account email address to Cloudbreak in the cluster create wizard on the Cloud Storage page. For detailed steps, see https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.7.1/content/gcp-data/index.html
- Alternatively, you can configure access to GCS manually by setting up a service account, placing the service account key on all nodes of the cluster, and setting the related properties in Ambari. For detailed steps, see https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/bk_cloud-data-access/content/authentication-gcp.html
Note that this capability is currently available only as a technical preview.
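To give a flavor of the manual route, here is a sketch of the kind of core-site properties the GCS connector uses. Treat the property names as an assumption to verify against the documentation linked above for your exact HDP/connector version, and adjust the project ID, service account, and keyfile path for your environment:

```python
def gcs_core_site_properties(project_id, sa_email, keyfile_path):
    """Sketch of core-site.xml properties for manual GCS access via Ambari.

    Property names are illustrative of the gs connector; verify them against
    the HDP docs for your version before applying.
    """
    return {
        # Registers the gs:// filesystem implementation
        "fs.gs.impl": "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem",
        "fs.gs.project.id": project_id,
        "fs.gs.auth.service.account.email": sa_email,
        # The key file must be present at this path on every node of the cluster
        "fs.gs.auth.service.account.keyfile": keyfile_path,
    }
```

Remember that whatever keyfile path you configure has to exist on all nodes, which is why the docs call out copying the key manually.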
07-26-2018
06:17 PM
@Vipin Gupta From the screenshot, it looks like your error is related to an upgrade or some other issue, not to the parameters described in this thread. I would recommend posting this as a new thread.
07-25-2018
06:31 PM
@Paul Norris It is possible to use Cloudbreak 2.7.1 to deploy HDP 3.0; however, Cloudbreak 2.7.1 does not include any default blueprints for HDP 3.0, so if you want to use HDP 3.0 you must first:
1) Create an HDP 3.0 blueprint. Here is an example:
{
"Blueprints": {
"blueprint_name": "hdp30-data-science-spark2-v4",
"stack_name": "HDP",
"stack_version": "3.0"
},
"settings": [
{
"recovery_settings": []
},
{
"service_settings": [
{
"name": "HIVE",
"credential_store_enabled": "false"
}
]
},
{
"component_settings": []
}
],
"configurations": [
{
"core-site": {
"fs.trash.interval": "4320"
}
},
{
"hdfs-site": {
"dfs.namenode.safemode.threshold-pct": "0.99"
}
},
{
"hive-site": {
"hive.exec.compress.output": "true",
"hive.merge.mapfiles": "true",
"hive.server2.tez.initialize.default.sessions": "true",
"hive.server2.transport.mode": "http"
}
},
{
"mapred-site": {
"mapreduce.job.reduce.slowstart.completedmaps": "0.7",
"mapreduce.map.output.compress": "true",
"mapreduce.output.fileoutputformat.compress": "true"
}
},
{
"yarn-site": {
"yarn.acl.enable": "true"
}
}
],
"host_groups": [
{
"name": "master",
"configurations": [],
"components": [
{
"name": "APP_TIMELINE_SERVER"
},
{
"name": "HDFS_CLIENT"
},
{
"name": "HISTORYSERVER"
},
{
"name": "HIVE_CLIENT"
},
{
"name": "HIVE_METASTORE"
},
{
"name": "HIVE_SERVER"
},
{
"name": "JOURNALNODE"
},
{
"name": "MAPREDUCE2_CLIENT"
},
{
"name": "METRICS_COLLECTOR"
},
{
"name": "METRICS_MONITOR"
},
{
"name": "NAMENODE"
},
{
"name": "RESOURCEMANAGER"
},
{
"name": "SECONDARY_NAMENODE"
},
{
"name": "LIVY2_SERVER"
},
{
"name": "SPARK2_CLIENT"
},
{
"name": "SPARK2_JOBHISTORYSERVER"
},
{
"name": "TEZ_CLIENT"
},
{
"name": "YARN_CLIENT"
},
{
"name": "ZEPPELIN_MASTER"
},
{
"name": "ZOOKEEPER_CLIENT"
},
{
"name": "ZOOKEEPER_SERVER"
}
],
"cardinality": "1"
},
{
"name": "worker",
"configurations": [],
"components": [
{
"name": "HIVE_CLIENT"
},
{
"name": "TEZ_CLIENT"
},
{
"name": "SPARK2_CLIENT"
},
{
"name": "DATANODE"
},
{
"name": "METRICS_MONITOR"
},
{
"name": "NODEMANAGER"
}
],
"cardinality": "1+"
},
{
"name": "compute",
"configurations": [],
"components": [
{
"name": "HIVE_CLIENT"
},
{
"name": "TEZ_CLIENT"
},
{
"name": "SPARK2_CLIENT"
},
{
"name": "METRICS_MONITOR"
},
{
"name": "NODEMANAGER"
}
],
"cardinality": "1+"
}
]
}
2) Upload the blueprint to Cloudbreak (you can paste it under the Blueprints menu item).
3) When creating a cluster:
- Under General Configuration, select Platform Version > HDP-3.0, and then your blueprint should appear under Cluster Type.
- Under Image Settings, specify the Ambari 2.7 and HDP 3.0 public repos (you can find them in the Ambari 2.7 docs: https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.0.0/bk_ambari-installation/content/ch_obtaining-public-repos.html).
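Before pasting the blueprint into Cloudbreak, it can save a round trip to sanity-check the JSON locally. A minimal sketch (the checks simply mirror the sections the example blueprint above uses; this is not a Cloudbreak validator):

```python
import json

def check_blueprint(text):
    """Parse blueprint JSON and verify the sections the example above uses."""
    bp = json.loads(text)  # raises ValueError on malformed JSON
    assert "Blueprints" in bp and "host_groups" in bp, "missing required section"
    assert bp["Blueprints"].get("stack_version") == "3.0", "not an HDP 3.0 blueprint"
    # Return the host group names so you can eyeball them (e.g. master/worker/compute)
    return [hg["name"] for hg in bp["host_groups"]]
```

Running this on the example above should return the three host group names; a missing comma or brace fails immediately at the `json.loads` step instead of in the Cloudbreak UI.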
07-24-2018
07:05 PM
@Paul Norris I'm glad it worked for you. We will look into the Cloudbreak deployment problem from the Azure Marketplace. CC @pdarvasi
07-24-2018
06:14 PM
@Paul Norris, why don't you try using the following link instead of the Cloudbreak available on the Azure Marketplace? This installs Cloudbreak 2.7.1, while the Azure Marketplace version is older: https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.7.1/content/azure-quick/index.html After navigating to this page, click the blue button to open the install template.
07-12-2018
06:28 PM
1 Kudo
Adding @Sarah Olson in case she can help with this.