# dbt-af Tutorial

## Quick Start

### Prerequisites

1. A running instance of Airflow. There are a few ways to get one; the easiest is to use Docker Compose to run a local instance. See the docs for more information (a minimal sketch is included after this list).
2. Install dbt-af if you are not using the Docker Compose method:
   - via pip: `pip install dbt-af[tests,examples]`
3. Build the dbt manifest. You can use the provided script:

   ```bash
   cd examples/dags
   ./build_manifest.sh
   ```
4. Add the `dbt_dev` and `dbt_sensor_pool` pools to Airflow:

   - via the Airflow UI (Airflow Pools)
   - via the Airflow CLI: `airflow pools set dbt_dev 4 "dev"` (a sketch covering both pools follows this list)

   Start with a small number of open slots in each pool. If you are running on your local machine, too many concurrent tasks can exhaust its resources.
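For step 1, a minimal local setup following Airflow's official Docker Compose quick start might look like the sketch below. The Airflow version in the URL is only an example; substitute the version you actually want from the Airflow docs.

```bash
# Fetch the reference docker-compose.yaml from the Airflow docs
# (the version in the URL is an example; pick the one you need).
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.9.3/docker-compose.yaml'

# Create the directories Airflow expects and record the host user id.
mkdir -p ./dags ./logs ./plugins ./config
echo "AIRFLOW_UID=$(id -u)" > .env

# Initialize the metadata database, then start all services in the background.
docker compose up airflow-init
docker compose up -d
```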
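For step 4, creating both pools from the CLI could look like this; the slot counts and descriptions are illustrative, not required values.

```bash
# Create the two pools the example DAGs expect; keep slot counts small locally.
airflow pools set dbt_dev 4 "dev"
airflow pools set dbt_sensor_pool 4 "sensors"
```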

## List of Examples

1. Basic Project: a single domain, small tests, and a single target.
2. Advanced Project: several domains, medium and large tests, and different targets.
3. Dependencies management: how to manage dependencies between models in different domains.
4. Manual scheduling: domains with manual scheduling.
5. Maintenance and source freshness: how to manage maintenance tasks and source freshness.
6. Kubernetes tasks: how to run dbt models in Kubernetes.
7. Integration with other tools: how to integrate dbt-af with other tools.
8. [Preview] Extras and scripts: available extras and scripts.