Apache Airflow Training

Apache Airflow was introduced by Airbnb. This open-source workflow management platform is designed for data engineering pipelines. Multisoft Systems offers Apache Airflow Training for experienced professionals who want to add an in-demand technology to their resume, technology aspirants who want to move away from conventional schedulers, and data engineers who want to deploy their pipelines with Airflow in their company's projects. Choose this Apache Airflow Training if you want to manage your company's increasingly complex workflows effectively.

Airflow is a must-have if you have many ETL pipelines to manage. Multisoft Systems offers Apache Airflow Training to teach participants everything they need to start using Apache Airflow, through both theory and practical videos. The course covers the fundamentals: what Airflow is, how it works, and how to configure it. You will also learn how to create plugins, build real dynamic pipelines, and apply advanced Apache Airflow concepts.

Apache Airflow Course Objectives:
  • How to work with core functionalities such as DAGs, Tasks, Operators, and Workflows?
  • How to understand and apply advanced Apache Airflow concepts such as XComs, Branching, and SubDAGs?
  • What is the difference between the Sequential, Local, and Celery Executors?
  • How to use the Sequential, Local, and Celery Executors?
  • How to use Apache Airflow in a Big Data ecosystem with PostgreSQL, Hive, and Elasticsearch?
  • How to create plugins that add functionality to Apache Airflow?
  • How to use Docker with Airflow and different executors?
  • How to implement Airflow-based solutions to real data-processing problems?
  • How to install and configure Apache Airflow?
Apache Airflow Online Training
  • Recorded Videos After Training
  • Digital Learning Material
  • Course Completion Certificate
  • 24x7 After Training Support
Apache Airflow Course Target Audience
  • Data engineers who want to deploy their pipelines with the use of Airflow
  • Engineers who want to switch their careers from conventional schedulers
  • Professionals who want to add an in-demand technology to their resume
  • Professionals who want to learn basic and advanced concepts about Apache Airflow
Apache Airflow Course Prerequisites
  • You should have prior work experience in programming or scripting to pursue this Apache Airflow Training. You need a machine with at least 8 GB of memory and VirtualBox installed; a virtual machine image of about 3 GB must also be downloaded. Working experience in Python will also help you a lot.
Apache Airflow Course Certification
  • Multisoft Systems will provide you with a training completion certificate after you complete this Apache Airflow Training.

Apache Airflow Course Content

Module 1: Course Introduction 

  • Important Prerequisites 
  • Course Objectives 
  • Who am I? 
  • Development Environment

Module 2: Getting Started with Airflow

  • Introduction
  • Why Airflow?
  • What is Airflow? 
  • How Airflow Works
  • Installing Airflow 2.0
  • The CLI in Airflow 2.0
  • What you need to know about the UI

Module 3: Coding Your First Data Pipeline with Airflow 

  • Introduction
  • What is a DAG?
  • Time to code your first DAG
  • DAG Skeleton
  • What is an Operator?
  • Creating a Table
  • The secret weapon!
  • The Providers
  • Is the API available?
  • Extracting users
  • Processing users
  • Storing users
  • Order matters!
  • Your data pipeline in action!
  • DAG scheduling
  • Backfilling and catchup

Module 4: Databases and Executors

  • Introduction
  • The default configuration 
  • Start scaling with the Local Executor
  • Scale to infinity with the Celery Executor
  • Scaling Airflow with the Celery Executor in action!
  • Changing the executor
  • Concurrency, the parameters you must know!
  • Concurrency in practice
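
The executor and concurrency parameters covered in this module live in `airflow.cfg`. The sketch below uses the Airflow 2.0 parameter names (later versions rename some of them, e.g. `dag_concurrency` becomes `max_active_tasks_per_dag`); the values are illustrative only:

```ini
[core]
# Which executor runs your tasks
executor = LocalExecutor          # SequentialExecutor | LocalExecutor | CeleryExecutor

# Metadata database; the Sequential Executor uses SQLite,
# Local and Celery need a database such as PostgreSQL
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost/airflow

# Max task instances running at once across the whole Airflow instance
parallelism = 32

# Max task instances running at once within a single DAG
dag_concurrency = 16

# Max concurrent DagRuns per DAG
max_active_runs_per_dag = 16
```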

Module 5: Implementing Advanced Concepts in Airflow 

  • Introduction
  • Adios repetitive patterns
  • Minimising DAGs with SubDAGs
  • Adios SubDAGs, Welcome TaskGroups!
  • Sharing data between tasks with XComs
  • XComs in action!
  • Choosing a specific path in your DAG
  • Executing a task according to a condition
  • Trigger rules or how tasks get triggered
  • Changing the way your tasks are triggered

Module 6: Creating Airflow Plugins with Elasticsearch and PostgreSQL

  • Introduction
  • Installing Elasticsearch
  • How the plugin system works
  • Creating a hook interacting with Elasticsearch
  • Creating the PostgresToElasticOperator

Module 7: Using Apache Airflow with Docker 

  • Introduction
  • Quick Reminder About Docker
  • Running Airflow on Docker with the Celery Executor
  • Running Airflow on Docker with the Local Executor

Module 8: Airflow 2.0

  • What to expect from Airflow 2.0?
