.. _dbt-airflow-concepts:

Similar dbt & Airflow concepts
==============================

While dbt is an open-source tool for data transformation and analysis using SQL, Airflow focuses on being
a platform for the development, scheduling and monitoring of batch-oriented workflows, using Python.
Although both tools have many differences, they also share similar concepts. This page lists some of these
concepts to help those who may be new to Airflow or dbt and are considering using Cosmos.

.. list-table::
   :align: left
   :widths: auto
   :header-rows: 1

   * - Airflow naming
     - dbt naming
     - Description
     - Differences
   * - `DAG `_
     - `Workflow `_
     - Pipeline (Directed Acyclic Graph) that contains a group of steps
     - Airflow expects upstream tasks to have succeeded before running downstream tasks. dbt can run a subset of tasks assuming upstream tasks were already run.
   * - `Task `_
     - `Node `_
     - Step within a pipeline (DAG or workflow)
     - In dbt, these are usually transformations that run on a remote database. In Airflow, steps can be anything, running locally in Airflow or remotely.
   * - `Language `_
     - `Language `_
     - Programming or declarative language used to define pipelines and steps
     - In dbt, users write SQL, YAML and Python to define the steps of a pipeline. Airflow expects steps and pipelines to be written in Python.
   * - `Variables `_
     - `Variables `_
     - Key-value configuration that can be used in steps and avoids hard-coded values
     -
   * - `Templating `_
     - `Macros `_
     - Jinja templating used to access variables, configuration and reference steps
     - dbt encourages using Jinja templating for control structures (if and for). Jinja is native to Airflow/Python, where it is used to define variables, macros and filters.
   * - `Connection `_
     - `Profile `_
     - Configuration to connect to databases or other services
     -
   * - `Providers `_
     - `Adapter `_
     - Additional Python libraries that support specific databases or services
     -
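
To make the mapping more concrete, the sketch below uses Airflow's Python API to illustrate a few of the
rows above: a DAG (pipeline), two tasks (steps), an Airflow Variable and Jinja templating. It is a minimal
illustration and not taken from the Cosmos documentation; the DAG id ``example_concepts``, the Variable key
``my_schema`` and the use of ``BashOperator`` are arbitrary choices, and the ``schedule`` argument assumes
Airflow 2.4 or later.

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="example_concepts",  # the pipeline: an Airflow DAG / a dbt workflow
        start_date=datetime(2023, 1, 1),
        schedule="@daily",
        catchup=False,
    ):
        # A step: an Airflow task / a dbt node. In Airflow it can run anything;
        # here it is just a shell command.
        extract = BashOperator(
            task_id="extract",
            # Jinja templating: {{ ds }} is the logical date and
            # {{ var.value.my_schema }} reads the Airflow Variable "my_schema".
            bash_command="echo 'extracting {{ ds }} into schema {{ var.value.my_schema }}'",
        )

        transform = BashOperator(
            task_id="transform",
            bash_command="echo 'transforming'",
        )

        # Airflow only runs "transform" after "extract" has succeeded; dbt would
        # infer a similar dependency from ref() calls between models.
        extract >> transform

On the dbt side, the equivalent pipeline would be expressed as SQL models connected through ``ref()``, with
variables read via ``var()`` and connection details coming from a profile rather than an Airflow connection.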