Installation of the Hortonworks suite
Labels: Apache HBase, Apache Pig, Apache Sqoop, HDFS
Created on 06-15-2016 01:23 PM - edited 09-16-2022 03:25 AM
Hello,
In my organization we currently have a data warehouse setup, and we want to migrate it to HDFS. I need HDFS, Sqoop, Hive, HBase, and Pig at a bare minimum, plus the admin-side details for this. Can anybody help me with this? It would help me a lot.
Created 06-17-2016 09:45 AM
Steps to set up the cluster:
There are some prerequisites for Hadoop that we need to take care of before we set up the cluster. We will do them one by one:
1) Set up password-less SSH from the master to the slaves.
2) Update the /etc/hosts file on each node to contain entries for all hosts in the cluster.
3) Install Java.
4) Disable SELinux and iptables.
5) Download the Ambari repo:
wget -nv http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.1.0/ambari.repo -O /etc/yum.repos.d/ambari.repo
6) Install Ambari:
yum install ambari-server
7) Run the setup command and press Enter to accept the defaults for all questions:
ambari-server setup
8) Start the Ambari server:
ambari-server start
9) Once the service has started successfully, open the UI at <ambari IP>:8080.
10) The default login credentials are admin:admin (username:password).
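The prerequisite steps above can be sketched as a shell session. This is a minimal sketch for CentOS 6, run as root; all hostnames and IP addresses are placeholders, not values from this thread:

```shell
# 1) Password-less SSH from the master to each slave (run on the master)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
ssh-copy-id root@slave1.example.com
ssh-copy-id root@slave2.example.com

# 2) /etc/hosts entries for every node; copy the same entries to each host
cat >> /etc/hosts <<'EOF'
192.168.1.10  master.example.com  master
192.168.1.11  slave1.example.com  slave1
192.168.1.12  slave2.example.com  slave2
EOF

# 3) Install Java (OpenJDK shown; ambari-server setup can also install a JDK)
yum install -y java-1.7.0-openjdk

# 4) Disable SELinux (immediately and across reboots) and iptables
setenforce 0
sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config
service iptables stop
chkconfig iptables off
```

These commands change system-wide configuration, so they are shown as a reference fragment rather than a script to run blindly; adapt the hostnames, IPs, and Java version to your environment.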
For the next steps, follow the link:
If this helps you, please accept the answer and let's close this.
Created 06-15-2016 04:38 PM
That's a loaded question. Can you be more specific?
Created 06-17-2016 02:24 AM
Created 06-16-2016 05:41 AM
@Sunile Manjee: I need all the information required to set up a Hadoop environment using the Hortonworks suite, and also the admin part (which things to consider).
Created 06-17-2016 12:15 PM
Apart from the answers above from Ashnee and Sunile, there are a number of questions that need to be answered to deploy the HDP components correctly:
1. What is the business case for this cluster build? DWH?
2. Is it a POC or intended for production?
3. How many nodes in this cluster? For HA you need to consider NameNode, DataNode, and ZooKeeper redundancy, etc.
4. With an RDBMS you will definitely need Sqoop, but what other components do you want deployed?
5. Remember to build your Hadoop architecture to map your business needs.
@Ashnee
You forgot to mention NTP synchronization between the nodes in the cluster, which is very important for ZooKeeper, etc.
Created 06-17-2016 12:50 PM
Yes, right. Please install NTP; refer to the link:
http://www.openkb.info/2014/06/ntp-tips-for-hadoop-cluster.html
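For reference, a minimal NTP setup on CentOS 6 might look like the following. This is a sketch assuming the default public pool servers in /etc/ntp.conf; repeat it on every node in the cluster:

```shell
# Install the NTP daemon and enable it across reboots
yum install -y ntp
chkconfig ntpd on

# Do an initial one-shot sync while the daemon is stopped, then start it
service ntpd stop
ntpdate pool.ntp.org
service ntpd start

# Verify that peers are reachable and the clock is synchronized
ntpq -p
```

Keeping all nodes within a small clock offset matters most for ZooKeeper and HBase, which are sensitive to time skew between hosts.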
Created 07-12-2016 08:36 AM
If you got the answer, then let's close this.
Created 10-04-2016 09:46 AM
1) Can you update how much data there is?
2) Do you want to build a POC cluster or a production one?