Member since: 08-28-2015
Posts: 194
Kudos Received: 45
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
| 2212 | 07-05-2017 11:58 PM
06-01-2017 02:41 AM
All my nodes and services show the same error: Connection failed to 'http://ip-172-31-18-1.ca-central-1.comp...'. I believe I set up passwordless SSH correctly (I have done it many times before and it always worked), and my security group allows all TCP, all UDP, and SSH. Does anyone think this is an AWS/EC2/DNS problem, since every connection fails without exception? Any ideas? Thank you so much for your help.
Labels:
- Apache Ambari
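When every agent registration fails at once, it is usually DNS, SSH, or a blocked port rather than one bad node. A minimal sketch of checks to run from the Ambari server, assuming a hypothetical worker hostname and the usual Ambari agent ports (verify the ports against your Ambari version's docs):

```shell
# Hypothetical hostname; replace with a real host from your cluster.
TARGET=ip-172-31-18-1.ca-central-1.compute.internal

# 1. Does the private DNS name resolve inside the VPC?
getent hosts "$TARGET" || echo "DNS lookup failed"

# 2. Is passwordless SSH really non-interactive (no prompt, no agent)?
ssh -o BatchMode=yes -o ConnectTimeout=5 "$TARGET" true \
  && echo "SSH OK" || echo "SSH failed"

# 3. Is the agent ping port reachable? (8670 is the typical default;
#    the server side commonly uses 8440/8441.)
nc -z -w 5 "$TARGET" 8670 && echo "port open" || echo "port closed/filtered"
```

If step 1 fails for the internal DNS names, the problem is on the AWS/VPC side (DNS resolution disabled, or hosts in different subnets/security groups), not in Ambari.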
05-31-2017 07:13 PM
Thanks, I found your other post and tried Ambari 2.2, so I don't need to deal with SmartSense.
05-31-2017 06:27 PM
I would still like to know the best combination of RHEL version and ambari.repo version. The documentation says any combination should work, but different errors come up.
05-31-2017 06:11 PM
Yes, the SmartSense error is not showing up any more. Thank you so much for that.
05-31-2017 05:18 PM
Thank you for your reply.
1. I tried every combination of AWS/EC2 RHEL 6/7 with Ambari 2.4 and 2.2. I cannot avoid the SmartSense error, and the downloaded ambari.repo does not include SmartSense.
2. To download SmartSense, an account on support.hortonworks.com is needed, and I have no idea how to create one.
3. If you have one, I need a combination of RHEL, Ambari, and HDP versions that avoids SmartSense. Meanwhile, I will try ambari.repo 2.2 and hope I will not have SmartSense issues. Your help is greatly appreciated; best answer to you.
05-31-2017 04:24 PM
When installing Ambari, the installer repeatedly asks for the SmartSense package. It is not in ambari.repo and can only be downloaded from support.hortonworks.com, which requires an account. I would like to know how to create a new user account on that website.
Labels:
- Hortonworks SmartSense
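Before hunting for the package on the support portal, it is worth confirming whether any configured repo can provide it at all. A hedged diagnostic sketch; the package name `smartsense-hst` is typical for HDP stacks and the repo id pattern `ambari*` is an assumption, so adjust both to your environment:

```shell
# Does any ambari repo list a SmartSense package? (repo id pattern is a guess)
yum --disablerepo='*' --enablerepo='ambari*' list available 2>/dev/null \
  | grep -i smartsense

# Which repo, if any, provides the smartsense-hst package? (typical name)
yum provides smartsense-hst
```

If neither command returns anything, the repo genuinely lacks SmartSense, which matches the behavior described above.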
05-31-2017 12:01 PM
BTW, we don't have HDP support here; we are just testing it for a project.
05-31-2017 11:58 AM
Thank you for this info. However, it says: 'For new installs, SmartSense will be installed by default. Entering your Hortonworks SmartSense ID is still required to activate SmartSense. No bundles will be captured and sent until a valid SmartSense ID is entered.' I tried Ambari 2.5; it asked for a SmartSense ID and I provided my company email, but I have received nothing so far. So without a SmartSense ID, you can never install HDP, right?
05-31-2017 05:41 AM
I have no account to log onto https://support.hortonworks.com. How do I get the SmartSense package then?
Tags:
- hst
- service-account
Labels:
- Hortonworks SmartSense
05-22-2017 04:52 PM
According to the MongoDB doc here, https://www.mongodb.com/blog/post/in-the-loop-with-hadoop-take-advantage-of-data-locality-with-mongo-hadoop: 'This means that we can accomplish local reads by putting a DataNode process (for Hadoop) or a Spark executor on the same node with a MongoDB shard and a mongos. In other words, we have an equal number of MongoDB shards and Hadoop or Spark worker nodes. We provide the addresses of all these mongos instances to the connector, so that the connector will automatically tag each InputSplit with the address of the mongos that lives together with the shard that holds the data for the split.' The doc says we have an equal number of MongoDB shards and Hadoop or Spark worker nodes; I take that to mean number of MongoDB shards = number of data nodes running Spark. So my next step is to ensure Spark is installed on all data nodes, and then put the MongoDB shards on those same worker nodes. Let me know if you think differently. Thanks, Robin
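Handing the connector the full list of mongos addresses might look like the sketch below. This is hedged: the hostnames, database, and jar are hypothetical, and the property names (`mongo.input.uri`, `mongo.input.mongos_hosts`) should be verified against your mongo-hadoop connector version before relying on them:

```shell
# One mongos per Spark worker node (hypothetical hosts).
MONGOS_HOSTS="ip-10-0-0-1:27017,ip-10-0-0-2:27017,ip-10-0-0-3:27017,ip-10-0-0-4:27017"

# spark.hadoop.* passes the properties into the Hadoop Configuration,
# where the mongo-hadoop connector reads them.
spark-submit \
  --conf "spark.hadoop.mongo.input.uri=mongodb://ip-10-0-0-1:27017/mydb.mycoll" \
  --conf "spark.hadoop.mongo.input.mongos_hosts=${MONGOS_HOSTS}" \
  my-mongo-job.jar
```

With the mongos list supplied, the connector can tag each InputSplit with its co-located mongos, which is what enables the local reads the blog post describes.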
05-22-2017 02:48 PM
I installed an HDP cluster with 4 data nodes and added the mongodb/mongos/mongodconfig services, i.e., MongoDB with sharding. I tried to add them to the HDP data nodes (2 groups with 2 nodes each), but somehow mongod and mongos are unable to start. So I started wondering:
1. With HDP plus mongodb/mongos, do we still need data nodes in the system?
2. Can a DataNode and a MongoDB shard node be on the same machine?
3. If not, can we have 0 data nodes in an HDP cluster?
4. Kafka is in the system and we need Kafka/Spark, so data nodes are needed for Spark/Scala RDDs in this architecture, right?
If anyone has any clue on this, please help. Thank you very much. Robin
Labels:
- Apache Hadoop
- Apache Kafka
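When mongod/mongos refuse to start under a custom Ambari service, the fastest clues are the service status, the log tail, and whether the ports are already taken. A quick sketch, assuming typical default paths and ports (adjust to however your service definition installs MongoDB):

```shell
# Service state (systemd or SysV, whichever the host uses).
systemctl status mongod --no-pager 2>/dev/null || service mongod status

# Last lines of the log usually name the failure: bad bind IP, port in
# use, dbPath permissions, or a missing keyFile. Path is the common default.
tail -n 50 /var/log/mongodb/mongod.log

# Default ports: 27017 mongod/mongos, 27018 shard server, 27019 config server.
ss -ltnp | grep -E '2701[789]'
```

A port conflict between a standalone mongod and the shard/config processes on the same node is a common reason all of them fail together.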
05-19-2017 10:16 PM
I will close this case; I got my answer here.
05-19-2017 05:56 PM
Yes, this is the piece of info that fixes my problem here. Thank you so much, Eyad Gareinabi. Robin
05-19-2017 02:43 PM
Does anyone have info on how MongoDB does sharding within an HDP cluster? I have MongoDB installed by Ambari, but when I check the sharding info on MongoDB, it tells me sharding is not enabled. When I try to enable sharding by commands, the MongoDB server goes down and I have to restart all components on that server.
Labels:
- Apache Ambari
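One common pitfall: the sharding commands only work when issued through a mongos, not against a plain mongod, so running them on the wrong process can look like the server "going down". A hedged sketch of the usual sequence, with hypothetical host and replica-set names (the `sh.*` helpers are standard mongo shell commands):

```shell
# Run against a mongos router, not a mongod.
mongo --host mongos-host:27017 --eval '
  sh.addShard("shardA/ip-10-0-0-1:27018");   // replica-set name + member host
  sh.enableSharding("mydb");                  // enable sharding for the DB
  sh.shardCollection("mydb.events", { _id: "hashed" }); // pick a shard key
  printjson(sh.status());                     // verify shard/chunk layout
'
```

If `sh.status()` reports no shards, the cluster was started as standalone mongods; they would need to be restarted as `--shardsvr` replica-set members with a config server replica set before sharding can be enabled.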
05-19-2017 02:41 PM
Does anyone have info on how MongoDB does sharding within an HDP cluster? I have MongoDB installed by Ambari, but when I check the sharding info on MongoDB, it tells me sharding is not enabled. When I try to enable sharding for MongoDB, the server goes down and I have to restart all components on that server.
Labels:
- Hortonworks Data Platform (HDP)
04-24-2017 12:41 PM
Thank you so much, Dave. After some searching, we may have to use the Kafka producer API for this complex data streaming. However, your answer gave me a new thought on NiFi with Kafka. Thank you very much for your time. Robin
04-22-2017 02:21 AM
Our new project is going to collect internet search data and move it to MongoDB in real time as a stream, so we are going to install Kafka. I am not sure whether Flume is mandatory for reading/collecting data from the internet; can Kafka work alone to get the data? On the MongoDB side, it seems the MongoDB sink will take care of writing data from Kafka to MongoDB. Does anyone have any ideas?
Labels:
- Apache Flume
- Apache Kafka
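Kafka does not require Flume on the ingest side: any process that can reach the brokers can publish. A minimal sketch with hypothetical topic, broker, and collector names (`my-scraper` stands in for whatever program fetches the search data; port 6667 is the typical HDP Kafka broker port, so verify yours):

```shell
# Create a topic for the incoming search data (hypothetical names).
kafka-topics.sh --create --zookeeper zk1:2181 \
  --replication-factor 1 --partitions 4 --topic search-events

# Pipe the collector's line-delimited output straight into the topic.
my-scraper | kafka-console-producer.sh \
  --broker-list broker1:6667 --topic search-events
```

For production you would use the Kafka producer API inside the collector instead of the console producer, but the point stands: Flume is optional, and a Kafka-to-MongoDB sink can then consume the topic independently.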
03-25-2017 07:26 PM
I have the same question on this. Basically, I would like to know whether the only way to add a client (gateway node) is after the cluster is installed. Can we include all nodes plus edge nodes during the Hadoop install, with no services (no NameNode or DataNode) on the edge nodes, but with Sqoop and Hue on them?
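Clients can also be added to a host after install through Ambari's REST API, which is useful when the wizard has already run. A hedged sketch: the cluster name `c1`, hostname, and credentials are placeholders, and the exact resource paths should be checked against your Ambari version's API docs:

```shell
AMBARI=http://ambari-host:8080/api/v1   # hypothetical Ambari server

# Register the SQOOP client component on the edge host...
curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
  "$AMBARI/clusters/c1/hosts/edge1.example.com/host_components/SQOOP"

# ...then ask Ambari to install it (state transition to INSTALLED).
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"HostRoles":{"state":"INSTALLED"}}' \
  "$AMBARI/clusters/c1/hosts/edge1.example.com/host_components/SQOOP"
```

The same pattern works for other client components (HDFS_CLIENT, HIVE_CLIENT, etc.), so an edge node can carry only clients and no NameNode/DataNode roles.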
03-23-2017 07:57 PM
We plan to set up a Flume + Kafka cluster for our real-time Spark Streaming system. Kafka and Flume are already in the cluster as services from the Hortonworks install. My questions:
1. We would like the Kafka server on a separate machine, not on any data node in the cluster. How do we move Kafka to a new machine? Can I remove Kafka and add it on the new machine from Ambari?
2. We need a few Kafka brokers (more machines added to this service), and all of these machines need to be configured as client nodes, right?
3. Is it possible to add all brokers through Ambari/Kafka?
Any idea on how to do this?
Labels:
- Apache Flume
- Apache Kafka
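Moving a broker in Ambari amounts to deleting the KAFKA_BROKER component on the old host and adding it on the new one; this can be done from the UI or via the REST API. A hedged sketch with hypothetical cluster name, hosts, and credentials (stop the broker first, and check the paths against your Ambari version's API docs):

```shell
A=http://ambari-host:8080/api/v1/clusters/c1   # hypothetical cluster "c1"

# Remove the broker component from the old data node (after stopping it).
curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE \
  "$A/hosts/old-node.example.com/host_components/KAFKA_BROKER"

# Add the broker component to the dedicated Kafka machine,
# which must already be registered as a cluster host.
curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
  "$A/hosts/kafka-node.example.com/host_components/KAFKA_BROKER"
```

Repeating the POST for each additional machine is how extra brokers get attached to the same Ambari-managed Kafka service.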
03-20-2017 07:21 AM
Q1: Is setting up triggers on each table the only way NiFi can capture data changes?
Q2: What if I have over 100 tables to ingest into Hadoop? Each table would need 3 triggers, on insert, delete, and update. Do I need to set up all of them?
Q3: How exactly does NiFi read log files from an RDBMS?
Q4: Do you think Sqoop is the better tool in my case?
Labels:
- Apache NiFi
- Apache Sqoop
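On Q4: Sqoop can pull changed rows without any triggers, as long as each table has a timestamp (or monotonically increasing key) column to check. A sketch with hypothetical JDBC URL, table, and column names, using Sqoop's standard incremental-import flags:

```shell
sqoop import \
  --connect jdbc:mysql://db-host:3306/sales \
  --username etl --password-file /user/etl/.pw \
  --table orders \
  --incremental lastmodified \
  --check-column updated_at \
  --last-value "2017-03-01 00:00:00" \
  --target-dir /data/raw/orders \
  --merge-key order_id
```

Note the trade-off: `--incremental lastmodified` catches inserts and updates but not deletes, which is exactly the case where trigger- or log-based change capture still earns its keep.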
12-26-2016 04:01 AM
Let's say I have 10 nodes for a Hadoop install, and 4 of them are edge nodes meant for third-party ML or ETL tools. Can I include all nodes in the installation, put only a basic Hadoop installation on those 4 nodes, and then install the third-party tools on them?
0. When starting the cluster installation, I can list all 10 servers, right? These 4 nodes are supposed to be part of the Hadoop cluster.
1. During the cluster installation, is there anywhere I can assign a 'client' role to these 4 nodes?
2. If yes, please show me a link; I need the exact steps to set up that 'client' role for these 4 nodes.
3. If not, may I just install Hadoop on these nodes and simply install the third-party tools afterwards? I already have Ambari installed on a separate server. Thank you very much for your help. Robin
Labels:
- Apache Ambari
- Apache Hadoop
07-31-2016 11:21 PM
Yes, it was a typo. Thank you very much for this one.
07-28-2016 08:59 PM
The errors show as follows:
[INFO] Unable to bind key for unsupported operation: backward-delete-word
[INFO] Unable to bind key for unsupported operation: backward-delete-word
[INFO] Unable to bind key for unsupported operation: down-history
[INFO] Unable to bind key for unsupported operation: up-history
--- and many more after this
Labels:
- Apache Hive
03-06-2016 09:39 PM
Thank you both. Yes, at ambari-server setup.
03-06-2016 09:23 PM
Thanks, Artem. Can you let me know where I can pick the database vendor during the Ambari installation? Or can I go ahead with the process and change to another database vendor later? I guess not 🙂
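The database vendor is chosen during `ambari-server setup`, either interactively or via flags. A hedged sketch of the non-interactive form, with placeholder host and credentials (the flag names below are typical of Ambari's setup command, but verify them against your Ambari version with `ambari-server setup --help`):

```shell
# Point Ambari's own metadata store at an external PostgreSQL instance
# instead of the embedded default (hypothetical host/credentials).
ambari-server setup \
  --database=postgres \
  --databasehost=db-host \
  --databaseport=5432 \
  --databasename=ambari \
  --databaseusername=ambari \
  --databasepassword=secret
```

Re-running `ambari-server setup` later is also how the database choice gets changed after the fact, though the existing metadata has to be migrated to the new database manually.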
03-06-2016 09:12 PM
1 Kudo
I didn't see Derby in the list, so what should I do about this warning?
Labels:
- Apache Oozie
03-06-2016 08:44 PM
1 Kudo
Thanks, Mr. Wonderful. Yes, I can start the Ambari install after adding the HTTP role to the security group configuration. The default port for HTTP is 80, but 8080 is still working, which is good. Thank you very much. Robin
03-06-2016 06:21 AM
1 Kudo
I think we are close to resolving this problem. Tell me how to add this port to my network security settings. AWS/EC2 has security group settings, but I don't see a setting for the port number; I will look into it. Thank you for this valuable info. Robin
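The port lives in the security group's inbound rules. In the EC2 console it is an "Inbound rules" entry (Custom TCP, port 8080); the same rule can be added from the AWS CLI. A sketch with a hypothetical security group id and source IP:

```shell
# Allow inbound TCP 8080 (Ambari web UI) from one admin IP only.
# sg-0123456789abcdef0 and 203.0.113.5 are placeholders.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp --port 8080 \
  --cidr 203.0.113.5/32
```

Scoping the rule to a single /32 rather than 0.0.0.0/0 keeps the Ambari UI off the open internet while still letting you reach it.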
03-06-2016 06:13 AM
1 Kudo
(attachment: 306-neeraj.png) You are right, I need to use the public IP, so I used this URL: http://ec2-54-xx-xx-xx.compute-1.amazonaws.com:8080/. However, I still get this error; please see the attached image. Please help.