“The learning paths on Dataquest are incredible. They give you a direction through the learning process – you don’t have to guess what to learn next.”

Otávio Silveira

Data Analyst @ Hortifruti

Course overview

Manual scripts and cron jobs break down as data pipelines grow complex. Apache Airflow brings order to that chaos through workflow orchestration: tasks run in the right order, at the right time, with proper failure handling and monitoring. This course teaches you to build production-grade data pipelines the way professional teams do.

You’ll start by understanding orchestration concepts and Airflow’s architecture, then deploy a complete Airflow environment in Docker. Using the TaskFlow API, you’ll build increasingly sophisticated workflows: from simple ETL processes to pipelines with dynamic parallel processing and database connections. You’ll integrate Git-based version control and GitHub Actions CI/CD for automated deployment. Finally, you’ll build a real-world pipeline that scrapes Amazon book data, cleans it with Python, and loads it into MySQL on a schedule, complete with monitoring and alerting. By the end, you’ll have the skills to orchestrate complex data workflows reliably at scale.
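
To give a feel for the TaskFlow API the course builds on, here is a rough sketch of a TaskFlow-style ETL DAG. The task names and sample records are placeholders rather than course code, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# A minimal sketch of a TaskFlow-style ETL DAG. Task names and the
# sample records are illustrative only, not taken from the course.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def simple_etl():
    @task
    def extract() -> list[dict]:
        # In the course, this step pulls data from an API or web page.
        return [{"title": "Example Book", "price": "19.99"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Clean and type-cast the raw records.
        return [{**r, "price": float(r["price"])} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # A production pipeline would write to a database here.
        print(f"Loaded {len(records)} records")

    load(transform(extract()))


simple_etl()
```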

Key skills

  • Understanding workflow orchestration and how Airflow structures pipelines through DAGs and tasks
  • Deploying Airflow in Docker for development and production-like environments
  • Writing clean, maintainable DAGs using the TaskFlow API
  • Implementing parallel processing with Dynamic Task Mapping for scalable workflows
  • Managing database connections and credentials securely in Airflow
  • Setting up version control and automated deployment for DAGs using Git and GitHub Actions
  • Building complete ETL pipelines with monitoring, retries, and alerting
  • Extracting data from real-world APIs and loading it into databases on automated schedules

Course outline

Building Data Pipelines with Apache Airflow [4 lessons]

Introduction to Apache Airflow – 2h

Lesson Objectives
  • Understand workflow orchestration and its role in data pipelines
  • Explain Apache Airflow's core components: DAGs, tasks, and scheduler
  • Navigate and interpret the Airflow Web UI effectively
  • Differentiate between Airflow orchestration and traditional cron scheduling (see the sketch below)
  • Identify common Airflow use cases: ETL, ML, and analytics
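
To make the cron comparison concrete, here is a hedged sketch of what an Airflow DAG declares that a crontab line cannot: explicit task ordering plus retry behavior. The bash commands and settings are placeholders, and `schedule` again assumes Airflow 2.4+.

```python
# Sketch only: two placeholder tasks with an explicit dependency and retries,
# things a plain crontab entry cannot express.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="cron_vs_airflow_demo",
    schedule="@hourly",  # plays the role of the cron expression
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    pull = BashOperator(task_id="pull", bash_command="echo pull data")
    report = BashOperator(task_id="report", bash_command="echo build report")

    pull >> report  # report runs only after pull succeeds
```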

Running and Managing Apache Airflow with Docker (Part I) – 2h

Lesson Objectives
  • Deploy Apache Airflow using Docker Compose efficiently
  • Build ETL pipelines with TaskFlow API decorators
  • Implement dynamic task mapping for parallel processing (see the sketch below)
  • Debug workflows using Airflow Web UI and logs
  • Configure DAG scheduling, retries, and dependency management
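
Dynamic task mapping is easiest to see in code. In this rough sketch (the page URLs are stand-ins, not course data), a single task definition fans out into one task instance per input at runtime via `.expand()`, and those instances run in parallel subject to the executor’s concurrency limits.

```python
# A minimal sketch of Dynamic Task Mapping: one task definition expands into
# parallel task instances at runtime. The URL list is a placeholder.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def mapped_downloads():
    @task
    def list_pages() -> list[str]:
        # In a real pipeline this might come from an API or a config file.
        return [f"https://example.com/page/{i}" for i in range(1, 6)]

    @task
    def fetch(url: str) -> int:
        # Placeholder for an HTTP request; returns a fake byte count.
        print(f"fetching {url}")
        return len(url)

    # .expand() creates one fetch task instance per URL.
    fetch.expand(url=list_pages())


mapped_downloads()
```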

Running and Managing Apache Airflow with Docker (Part II) – 2h

Lesson Objectives
  • Complete the ETL lifecycle by adding the Load phase to the pipeline
  • Configure and manage MySQL database connections in Airflow (see the sketch below)
  • Switch from CeleryExecutor to LocalExecutor for simplified architecture
  • Implement Git-based DAG management using git-sync for automation
  • Set up CI/CD validation pipeline with GitHub Actions
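
As a rough illustration of the Load phase backed by a managed connection, a load task can use a hook tied to an Airflow connection instead of hard-coded credentials. The `books_db` conn_id and the table layout below are assumptions for illustration, and the apache-airflow-providers-mysql package must be installed; the course configures its own connection details.

```python
# Hedged sketch of a Load task that writes rows to MySQL through an Airflow
# connection. Used inside a @dag-decorated function like the ones above.
from airflow.decorators import task
from airflow.providers.mysql.hooks.mysql import MySqlHook


@task
def load_books(records: list[dict]) -> None:
    # Credentials live in the Airflow connection "books_db", not in code.
    hook = MySqlHook(mysql_conn_id="books_db")
    hook.insert_rows(
        table="books",
        rows=[(r["title"], r["price"]) for r in records],
        target_fields=["title", "price"],
    )
```

Because the credentials live in an Airflow connection, rotating them never requires a code change or a redeploy of the DAG.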

Automating Amazon Book Data Pipelines with Apache Airflow and MySQL – 2h

Lesson Objectives
  • Build real-world ETL pipelines using Apache Airflow TaskFlow API
  • Extract and transform live data from web sources systematically
  • Implement Git-based DAG management with automated git-sync integration
  • Configure CI/CD validation workflows using GitHub Actions pipelines
  • Design production-ready data pipelines with monitoring and alerting
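
For the monitoring and alerting objective, here is a hedged sketch of the usual building blocks: retries plus a failure callback. The retry values and the `notify` callback are illustrative; a real pipeline would route the alert to email, Slack, or a pager rather than logging it.

```python
# Minimal sketch of failure handling for a production DAG: retries plus an
# on_failure_callback. Values and callback body are illustrative only.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


def notify(context):
    # Airflow passes the task context to failure callbacks; a real pipeline
    # would forward this to an alerting channel.
    print(f"Task {context['task_instance'].task_id} failed on {context['ds']}")


@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={
        "retries": 3,
        "retry_delay": timedelta(minutes=10),
        "on_failure_callback": notify,
    },
)
def monitored_pipeline():
    @task
    def scrape() -> int:
        # Placeholder for the book-scraping step described above.
        return 42

    scrape()


monitored_pipeline()
```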

The Dataquest guarantee

Dataquest has helped thousands of people start new careers in data. If you put in the work and follow our path, you’ll master data skills and grow your career.

We believe so strongly in our paths that we offer a full satisfaction guarantee. If you complete a career path on Dataquest and aren’t satisfied with your outcome, we’ll give you a refund.

Master skills faster with Dataquest

Go from zero to job-ready

Learn exactly what you need to achieve your goal. Don’t waste time on unrelated lessons.

Build your project portfolio

Build confidence with our in-depth projects, and show off your data skills.

Challenge yourself with exercises

Work with real data from day one with interactive lessons and hands-on exercises.

Showcase your path certification

Share the evidence of your hard work with your network and potential employers.

Grow your career with Dataquest.

  • 98% of learners recommend Dataquest for career advancement
  • 4.85 Dataquest rating on SwitchUp (Best Bootcamps)
  • $30k average salary boost for learners who complete a path

Aaron Melton

Business Analyst at Aditi Consulting

“Dataquest starts at the most basic level, so a beginner can understand the concepts. I tried learning to code before, using Codecademy and Coursera. I struggled because I had no background in coding, and I was spending a lot of time Googling. Dataquest helped me actually learn.”

Jessica Ko

Machine Learning Engineer at Twitter

“I liked the interactive environment on Dataquest. The material was clear and well organized. I spent more time practicing than watching videos, and it made me want to keep learning.”

Victoria E. Guzik

Associate Data Scientist at Callisto Media

“I really love learning on Dataquest. I looked into a couple of other options and I found that they were much too handhold-y and fill in the blank relative to Dataquest’s method. The projects on Dataquest were key to getting my job. I doubled my income!”

Join 1M+ data learners on Dataquest.

1. Create a free account
2. Choose a learning path
3. Complete exercises and projects
4. Advance your career

Start learning today