What's New @ Cloudera

Find the latest Cloudera product news
Announcements

[Preview] Accelerate data pipeline development with self-service, no-code Airflow authoring UI in Cloudera Data Engineering


With the Cloudera Data Engineering (CDE) Pipeline authoring UI, any CDE user, irrespective of their level of Airflow expertise, can create multi-step pipelines with a combination of out-of-the-box operators (CDEOperator, CDWOperator, BashOperator, PythonOperator). More advanced users can continue to deploy their own custom Airflow DAGs (Directed Acyclic Graphs) as before, or use the Pipeline authoring UI to bootstrap their projects for further customization. Once a pipeline has been developed through the UI, it is deployed and operationally managed through the same best-in-class APIs and job life-cycle management that users have come to expect from CDE. A rough sketch of this kind of multi-step DAG is shown below.
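
For readers less familiar with Airflow, the following is a minimal sketch of the sort of multi-step DAG the authoring UI assembles from those operators. It is illustrative only: the CDE job operator is shown here as CDEJobRunOperator with an assumed import path and an assumed existing CDE job name ("etl-spark-job"); the exact class name, import path, and parameters in your CDE release may differ, and this is not the literal code the UI emits.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

# Assumption: the CDE-provided operator ships with CDE's embedded Airflow;
# verify the import path and class name against your CDE documentation.
from cloudera.cdp.airflow.operators.cde_operator import CDEJobRunOperator


def report_status(**context):
    """Simple Python step, e.g. for notifications or bookkeeping."""
    print(f"Pipeline run {context['run_id']} reached the reporting step")


with DAG(
    dag_id="example_multi_step_pipeline",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",   # basic time-based schedule
    catchup=False,
) as dag:

    # Step 1: a shell command, e.g. staging or validating input data.
    prepare = BashOperator(
        task_id="prepare_inputs",
        bash_command="echo 'staging input data'",
    )

    # Step 2: trigger an existing CDE Spark job.
    # Assumption: 'etl-spark-job' already exists in the same virtual cluster.
    run_spark = CDEJobRunOperator(
        task_id="run_spark_etl",
        job_name="etl-spark-job",
    )

    # Step 3: a Python callable, e.g. reporting or downstream notification.
    report = PythonOperator(
        task_id="report_status",
        python_callable=report_status,
    )

    # Chain the steps into a simple linear pipeline.
    prepare >> run_spark >> report
```

The authoring UI builds this kind of dependency chain through point-and-click configuration, while advanced users can hand-edit or extend the generated DAG as they see fit.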


Airflow has been adopted by many Cloudera Data Platform (CDP) customers in the public cloud as the next-generation orchestration service for setting up and operationalizing complex pipelines. Until now, building such pipelines required knowledge of Airflow and the associated Python configuration. As a result, users tended to limit their pipeline deployments to basic time-based scheduling of Spark jobs and steered away from the more complex multi-step pipelines that are typical of data engineering workflows. The CDE Pipeline authoring UI abstracts those complexities away, making multi-step pipeline development self-service and point-and-click driven, and providing an easier path than before to developing, deploying, and operationalizing true end-to-end data pipelines.


This feature is in Preview and available on new CDE services only. When creating a Virtual Cluster, a new option allows you to enable the Airflow authoring UI.