09-03-2018 09:25 PM - edited 09-03-2018 09:28 PM
I like to think of myself as an information guy. I started with Informix back in the day and worked with tools like Crystal Reports before moving to Cognos and SAS.
I realised that the delivery of information was always critically pinned to the underlying data and inputs, so I kept going back to the data. I became an Oracle DBA for a while before brushing up on programming and getting into Java.
I have always liked open source and worked for a while on a lot of open-source-based tools like Jaspersoft and Talend before moving into Apache Spark and discovering "big data".
Today I'm enjoying the world of Cloudera, bringing all the tools I like together and making it quicker and easier to bring data in and make information available to the people who need it.
I am active in the Melbourne Java User Group and engaged in the Melbourne Scala Group as well. I have done a range of projects in Python and R, which are both great languages, but the JVM feels like home.
09-12-2018 04:21 AM
Feels great to be part of the mighty Cloudera! Here's my quick intro:
Qualification : Bachelor of Engineering(Chennai|India)
IT Industry Experience : 5.4 years as of September 2018
Areas of Interest : Big Data Engineering & solutions
Skillset : HDFS, MapReduce, Pig, Hive, Kafka, Storm, HBase, Spark
Current Role : Technology Analyst @ Infosys Limited
Delivered 2 enterprise projects so far using Big Data technologies:
1. A system to generate purchase-pattern and fraud-detection reports for a retail client, using Hive for analysis, HBase for data enrichment, Sqoop for report export, and Oozie for scheduling.
2. A country's taxation portal, built on Kafka for cluster messaging, Storm for processing, and HBase for storage.
Outside of work:
1. Exploring the processing & analytical features of various Big Data technologies.
2. Reading, travelling & triathlon.
3. Developing miscellaneous apps in Android Studio.
10-08-2018 05:47 AM
Since this is an English-speaking forum, allow me to post a translation of your message. Hopefully Google got it right.
Hello everyone, I am from China. I have only been working with Hadoop for two years, mainly responsible for the operation and maintenance of big data clusters. I welcome lots of exchanges.