Hi all, I would like to implement a real-time data feed between a web server and a Hadoop cluster. I plan to use Flume to read the web log files in real time, with HDFS/Hive as the target.
1. Is there a checklist of what to prepare on the security side (firewalls, ports to open, etc.)?
2. Are there any Hadoop agents I need to install on the web server?
3. Once the data is available in Hive, I will run a regular job to process it with Impala; the output will be a list of suggestions/messages for a particular web user. How do I send that info back to that specific user's web page?
Update: we implemented Flume to move the web logs that had been transferred to the Hadoop edge server up to HDFS.
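For anyone following along, here is a minimal sketch of the kind of Flume agent config this setup could use, assuming the logs land in a spooling directory on the edge node (the agent name, directories, and HDFS path below are placeholders, not our actual values):

```properties
# Hypothetical agent name and paths -- adjust to your environment.
agent1.sources = weblog-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

# Watch a spool directory on the edge node for completed log files.
agent1.sources.weblog-src.type = spooldir
agent1.sources.weblog-src.spoolDir = /data/staging/weblogs
agent1.sources.weblog-src.channels = mem-ch

agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# Write into a date-partitioned HDFS layout that Hive can sit on top of.
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.channel = mem-ch
agent1.sinks.hdfs-sink.hdfs.path = /user/hive/warehouse/weblogs/dt=%Y-%m-%d
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
agent1.sinks.hdfs-sink.hdfs.rollInterval = 300
agent1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
```

The spooldir source only picks up files once they are fully written, which suits rotated web logs better than tailing a live file.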
Also, due to firewall restrictions, security requirements, and the lack of a test environment, we used an alternative: the Zena job scheduler transfers the log files from the ATMs and the mobile web app to the Hadoop edge server.
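Zena itself is proprietary, but the job it schedules boils down to a copy-then-load step, roughly like this shell sketch (hosts, users, and paths here are made up; with DRY_RUN=1, the default in this sketch, it only prints the commands):

```shell
#!/bin/sh
# Hypothetical endpoints -- replace with your own.
LOG_DIR=/var/log/weblogs
EDGE=etluser@edge01.example.com
STAGING=/data/staging/weblogs
HDFS_TARGET=/user/hive/warehouse/weblogs

# Dry-run guard: prints commands instead of running them.
# Set DRY_RUN=0 in the scheduled job to execute for real.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "$@"; else "$@"; fi; }

# Ship the rotated log to the edge node, then load it into HDFS there.
run scp "$LOG_DIR/access.log.1" "$EDGE:$STAGING/"
run ssh "$EDGE" hdfs dfs -put -f "$STAGING/access.log.1" "$HDFS_TARGET/"
```

Copying rotated (closed) files rather than live ones avoids shipping half-written records.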
Kafka turned out to be a big challenge since we use LDAP, so security and authentication issues quickly cropped up.
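For context, Kafka clients don't talk to LDAP directly; one common pattern is SASL/PLAIN over TLS, with the brokers validating the credentials against LDAP through a custom callback handler on the broker side. A client-side sketch of what that looks like (the service account name is hypothetical):

```properties
# Assumes brokers expose SASL/PLAIN backed by LDAP via a custom
# AuthenticateCallbackHandler; credentials are just illustrative.
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="svc_flume" \
  password="********";
```

If the cluster is Kerberized instead, the equivalent would be `sasl.mechanism=GSSAPI` with a keytab, which sidesteps LDAP passwords entirely.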
Kudos for your suggestion!