Member since: 10-17-2018
Posts: 6
Kudos Received: 0
Solutions: 0
02-27-2020
11:43 PM
Thanks. Implemented Flume to pick up the weblogs that were transferred to the Hadoop edge server and load them into HDFS. Also, due to firewall restrictions, security requirements, and the lack of a test environment, we used an alternative approach: the Zena job scheduler transfers the log files from the ATM machines and the mobile web app to the Hadoop edge server. Kafka turned out to be a big challenge since we are using LDAP, so security and authentication issues quickly cropped up. Kudos for your suggestion!
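For reference, below is a minimal sketch of the edge-node-to-HDFS hop described above, written as a standalone upload over WebHDFS that a scheduler such as Zena could invoke. This is not the Flume implementation itself, just an illustration of landing staged files in HDFS; it assumes the third-party Python `hdfs` package (HdfsCLI) and that WebHDFS is reachable, and the host name, service user, and paths are hypothetical placeholders.

```python
# Minimal sketch: push staged log files from the edge server into HDFS over WebHDFS.
# Assumes the `hdfs` Python package (HdfsCLI) is installed and WebHDFS is enabled;
# host name, service user, and paths below are hypothetical placeholders.
import os
from hdfs import InsecureClient

LOCAL_STAGING = "/data/staging/weblogs"   # where the scheduler drops the log files
HDFS_TARGET = "/landing/weblogs"          # landing zone later read by Hive/Impala

client = InsecureClient("http://namenode.example.com:9870", user="etl")
client.makedirs(HDFS_TARGET)

for name in os.listdir(LOCAL_STAGING):
    local_path = os.path.join(LOCAL_STAGING, name)
    if os.path.isfile(local_path):
        # upload() copies the file into HDFS; overwrite=True keeps reruns idempotent
        client.upload(f"{HDFS_TARGET}/{name}", local_path, overwrite=True)
```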
12-09-2018
11:26 PM
Hi All, I would like to implement a real-time data feed between a web server and a Hadoop cluster. I plan to use Flume to read the web log files in real time, with HDFS/Hive as the target.
My questions are:
1. I need a checklist of what to prepare on the security side (firewalls, etc.).
2. Are there any Hadoop agents I need to install on the web server?
3. Once the data is available in Hive, I will have a regular job that processes it using Impala; the output will be a list of suggestions/messages for a particular web user. How do I send that information back to that specific web user's web page? (One possible approach is sketched after this post.)
Thank you
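For question 3, one common pattern (a sketch, not a definitive answer) is to have the web application's backend query the table that the Impala job writes to, look up the rows for the logged-in user, and render those suggestions on that user's page. The sketch below uses the third-party `impyla` package; the host name, table, and column names are hypothetical placeholders, and LDAP credentials would be added to the connection as needed.

```python
# Minimal sketch, assuming the Impala job writes its output to a table such as
# user_suggestions(user_id STRING, message STRING). The web backend looks up the
# rows for the logged-in user and returns them to be rendered on that user's page.
# Host, port, table, and column names are hypothetical placeholders.
from impala.dbapi import connect

def fetch_suggestions(user_id: str) -> list[str]:
    # add auth_mechanism="LDAP", user=..., password=... when LDAP auth is enforced
    conn = connect(host="impalad.example.com", port=21050)
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT message FROM user_suggestions WHERE user_id = %(uid)s",
            {"uid": user_id},
        )
        return [row[0] for row in cur.fetchall()]
    finally:
        conn.close()

# Example: the web app calls this while rendering the user's page
# suggestions = fetch_suggestions("web_user_123")
```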
Labels:
- Apache Flume
- Apache Hive
- Apache Impala
- HDFS
10-18-2018
12:00 AM
The link is no longer available. Please advise. Thanks!