Apache Airflow integration for dbt
Project description
This is a collection of Airflow operators to provide easy integration with dbt.
Installation
Install from PyPI:
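The package is published on PyPI under the name airflow-dbt:

```shell
pip install airflow-dbt
```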
It will also need access to the dbt CLI, which should either be on your PATH or can be set with the dbt_bin argument in each operator.
Usage
There are four operators currently implemented:
- DbtSeedOperator - Calls dbt seed
- DbtSnapshotOperator - Calls dbt snapshot
- DbtRunOperator - Calls dbt run
- DbtTestOperator - Calls dbt test
Each of the above operators accept the following arguments:
- profiles_dir - If set, passed as the --profiles-dir argument to the dbt command
- target - If set, passed as the --target argument to the dbt command
- dir - The directory to run the dbt command in
- full_refresh - If set to True, passes --full-refresh
- vars - If set, passed as the --vars argument to the dbt command. Should be set as a Python dictionary, as it will be passed to the dbt command as YAML
- models - If set, passed as the --models argument to the dbt command
- exclude - If set, passed as the --exclude argument to the dbt command
- select - If set, passed as the --select argument to the dbt command
- dbt_bin - The dbt CLI to use. Defaults to dbt, so assumes it's on your PATH
- verbose - The operator will log verbosely to the Airflow logs
Typically you will want to use the DbtRunOperator, followed by the DbtTestOperator, as shown earlier.
You can also use the hook directly. This is useful when you need to combine the dbt command with another task in the same operator, for example running dbt docs and uploading the docs to somewhere they can be served from.
Building Locally
To install from the repository, first create a virtual environment, then install using pip:
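For example:

```shell
python3 -m venv .venv
. .venv/bin/activate

pip install .
```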
Testing
To run tests locally, first create a virtual environment (see the Building Locally section). Then install the test dependencies and run the tests:
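For example (the requirements file name and test runner are assumptions; use whatever the repository defines):

```shell
pip install -r requirements.txt
pytest tests/
```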
Code style
This project uses flake8.
To check your code, first create a virtual environment (see Building Locally section):
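Then install and run flake8 over the source, tests, and setup script (directory names assumed from the repository layout):

```shell
pip install flake8
flake8 airflow_dbt/ tests/ setup.py
```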
Package management
If you use dbt's package manager you should include all dependencies before deploying your dbt project.
For Docker users, packages specified in packages.yml should be included as part of your Docker image by calling dbt deps in your Dockerfile.
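A sketch of the relevant Dockerfile steps (the base image, paths, and dbt install method are assumptions; adapt them to your image):

```dockerfile
FROM python:3.8-slim

# Install dbt and the integration (pin versions in a real image)
RUN pip install dbt airflow-dbt

# Copy the dbt project and resolve its package dependencies at build time
COPY . /srv/app/dbt
WORKDIR /srv/app/dbt
RUN dbt deps
```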
License & Contributing
- This is available as open source under the terms of the MIT License.
- Bug reports and pull requests are welcome on GitHub at https://github.com/gocardless/airflow-dbt.
GoCardless ♥ open source. If you do too, come join us.
Release history
0.3.0
0.2.0
0.1.2
0.1.1
0.1.0
0.0.1
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Filename, size | File type | Python version
---|---|---
airflow_dbt-0.3.0-py2.py3-none-any.whl (7.2 kB) | Wheel | py2.py3
airflow_dbt-0.3.0.tar.gz (7.6 kB) | Source | None
Hashes for airflow_dbt-0.3.0-py2.py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 3eb5a905acfb11b24308f94f57cf73b627eff9f9902a737dc2aab75b712685cd
MD5 | 80ab35c407e536c53eebf2fde70a202d
BLAKE2-256 | f84228eee2b927b15dd486e5edaab61f7661e1f1ce80a80299389be5896b3a1d
Hashes for airflow_dbt-0.3.0.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 22a5eac322cfb7bafbbba6b91e0c2f4b6fd3df4ff9b45a90944d04e6d58ebe21
MD5 | 6dd2540fd33c508699841e4aace0ba44
BLAKE2-256 | fb7120f07e45f17a219fdb41e93069c43774b62eacc1c94fa5afc9cf91432cbb