run
Usage
```
$ airflow tasks run [-h] [--cfg-path CFG_PATH] [-d {ignore,wait,check}] [-f]
                    [-A] [-i] [-I] [-N] [-l] [--map-index MAP_INDEX] [-m]
                    [-p PICKLE] [--pool POOL] [--read-from-db] [--ship-dag]
                    [-S SUBDIR] [-v]
                    dag_id task_id execution_date_or_run_id
```
Arguments
| Parameter | Description |
|---|---|
| dag_id | ID of the DAG |
| task_id | ID of the task |
| execution_date_or_run_id | The execution_date of the DAG or the run_id of the DAG run |
| --cfg-path | Path to the config file to use instead of airflow.cfg |
| -d, --depends-on-past | Determine how Airflow should deal with past dependencies. Possible values: `check` (the default, verify past dependencies are met before running), `ignore`, and `wait` |
| -f, --force | Ignore the previous task instance state and re-run the task even if it has already succeeded or failed |
| -A, --ignore-all-dependencies | Ignores all non-critical dependencies, including ignore_ti_state and ignore_task_deps |
| -i, --ignore-dependencies | Ignore task-specific dependencies, e.g. upstream, depends_on_past, and retry delay dependencies |
| -I, --ignore-depends-on-past | Deprecated; use `--depends-on-past ignore` instead |
| -N, --interactive | Do not capture standard output and error streams (useful for interactive debugging) |
| -l, --local | Run the task using the LocalExecutor |
| --map-index | Index of the mapped task instance |
| -m, --mark-success | Mark the task as succeeded without running it |
| -p, --pickle | Serialized pickle object of the entire DAG (used internally) |
| --pool | Resource pool to use |
| --read-from-db | Read the DAG from the database instead of the DAG file |
| --ship-dag | Pickles (serializes) the DAG and ships it to the worker |
| -S, --subdir | File or directory location from which to look for the DAG. Defaults to `[AIRFLOW_HOME]/dags` |
| -h, --help | Show the help message for this command |
| -v, --verbose | Make logging output more verbose |
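To illustrate how the positional arguments and flags above combine, here are two hedged invocation sketches. The DAG ID `example_dag`, task ID `extract`, and the run ID shown are hypothetical placeholders, not values from this reference:

```
# Re-run a single task for a given logical date, ignoring its previous
# state and executing with the LocalExecutor.
# (example_dag and extract are hypothetical placeholders)
airflow tasks run example_dag extract 2024-01-01 --force --local

# Run the same task addressed by a run_id instead of a date, skipping
# all non-critical dependency checks so it executes immediately.
airflow tasks run example_dag extract manual__2024-01-01T00:00:00+00:00 \
    --ignore-all-dependencies
```

Note that the third positional argument accepts either form: a logical date in the first call, a DAG run ID in the second.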