Data orchestration is the process of taking siloed data from multiple storage locations, combining and organizing it, and making it available to your developers, data engineers, and data scientists. This enables businesses to automate and streamline data-driven decision making. Apache Airflow is an open source orchestration tool that lets you programmatically author workflows in Python to run, schedule, monitor, and manage data engineering pipelines - no more manually managing those cron jobs! In this session, we will look at the architecture of Apache Airflow and then show you how to create and deploy a typical workflow. You will see how you can use the open source provider libraries to simplify your workflows when creating an end-to-end data pipeline. Expect lots of code and demos in this session. [15 min talk/presentation, 20-30 min demo]
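
To give a flavour of what "workflows in Python" looks like, here is a minimal sketch of an Airflow DAG, assuming Airflow 2.x; the DAG id, task names, and callables are illustrative placeholders, not part of the session material:

```python
# Minimal two-step pipeline: extract, then load, on a daily schedule.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from the siloed source systems.
    print("extracting data")


def load():
    # Placeholder: write the combined data to its destination.
    print("loading data")


with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",  # replaces a hand-managed cron entry
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare ordering: extract must finish before load starts.
    extract_task >> load_task
```

Because the pipeline is ordinary Python, dependencies, schedules, and retries live in version-controlled code that Airflow runs, monitors, and retries for you, rather than in scattered crontab entries.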