
Contributor
Posts: 36
Registered: ‎01-08-2017

Re: How to create Etl job in spark

[ Edited ]

So you mean I have to use a third-party tool like Pentaho, Informatica, etc. for ETL jobs? I thought there was a feature available inside Hadoop itself for creating jobs like the one mentioned in this URL, but my understanding was wrong then, right?

 

https://drive.google.com/open?id=0B-wEtRLWeFvMMGt1LWJUbURsTDA

Posts: 642
Topics: 3
Kudos: 121
Solutions: 67
Registered: ‎08-16-2016

Re: How to create Etl job in spark

In the open-source community the closest tool is Apache NiFi, but that is its own project and not part of the Apache Hadoop project. It would be your responsibility to integrate and manage it; Cloudera assumes that responsibility for you when it chooses to support a project in CDH. Cloudera does not currently support NiFi, but it may in the future.

So no, there is no built-in way in Hadoop itself to build ETL jobs through a GUI.
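
The usual approach instead is to write the ETL logic as Spark code and submit it to the cluster. Below is a minimal sketch of such a job in Scala; the HDFS paths and column names (order_id, quantity, unit_price) are hypothetical and only there for illustration.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SimpleEtlJob {
  def main(args: Array[String]): Unit = {
    // Spark 2.x entry point; the app name is arbitrary.
    val spark = SparkSession.builder()
      .appName("simple-etl-job")
      .getOrCreate()

    // Extract: read raw CSV from HDFS (path is hypothetical).
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/raw/orders.csv")

    // Transform: drop rows without an ID and derive a total column.
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .withColumn("order_total", col("quantity") * col("unit_price"))

    // Load: write the result as Parquet for downstream tools (e.g. Hive/Impala).
    cleaned.write
      .mode("overwrite")
      .parquet("hdfs:///data/curated/orders")

    spark.stop()
  }
}

Packaged as a jar, a job like this would typically be launched with spark-submit and scheduled with something like Oozie or cron on a CDH cluster.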