Connect to Airflow via CLI
Overview
Airflow provides a command line interface (CLI) implemented as the airflow script located in the /opt/airflow/bin directory.
While it is recommended to manage Airflow via ADCM or the Airflow web UI, the CLI gives you access to additional options for working with Airflow.
To interact with Airflow via CLI, connect to a cluster host with an Airflow component via SSH and execute the script by running:
$ sudo /opt/airflow/bin/airflow
The output looks like this:
Usage: airflow [-h] GROUP_OR_COMMAND ...

Positional Arguments:
  GROUP_OR_COMMAND

    Groups:
      celery         Celery components
      config         View configuration
      connections    Manage connections
      dags           Manage DAGs
      db             Database operations
      jobs           Manage jobs
      kubernetes     Tools to help run the KubernetesExecutor
      pools          Manage pools
      providers      Display providers
      roles          Manage roles
      tasks          Manage tasks
      users          Manage users
      variables      Manage variables

    Commands:
      cheat-sheet    Display cheat sheet
      dag-processor  Start a standalone Dag Processor instance
      info           Show information about current Airflow and environment
      kerberos       Start a kerberos ticket renewer
      plugins        Dump information about loaded plugins
      rotate-fernet-key
                     Rotate encrypted connection credentials and variables
      scheduler      Start a scheduler instance
      standalone     Run an all-in-one copy of Airflow
      sync-perm      Update permissions for existing roles and optionally DAGs
      triggerer      Start a triggerer instance
      version        Show the version
      webserver      Start a Airflow webserver instance

Options:
  -h, --help         show this help message and exit
The detailed description of each command is available in the Airflow2 CLI command reference.
For example, check the version by running:
$ sudo /opt/airflow/bin/airflow version
The output:
2.6.3
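In the same way, you can run the info command (listed in the help output above) to print details about the current Airflow installation and its environment; the exact output depends on your configuration:
$ sudo /opt/airflow/bin/airflow info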
You can display help for a specific command by using the --help option, for example:
$ sudo /opt/airflow/bin/airflow dags --help
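The --help option works for subcommands as well. For example, to see the arguments accepted by dags list, run:
$ sudo /opt/airflow/bin/airflow dags list --help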
The commands have the following syntax:
$ airflow [GROUP_OR_COMMAND] [SUB_COMMAND] [OPTIONS]
Where:
- GROUP_OR_COMMAND — an Airflow CLI command or a group of commands like dags, jobs, etc.
- SUB_COMMAND — a command related to a group, for example, dags list.
- OPTIONS — the selected command’s options.
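For example, in the following command, dags is the group, list-runs is the subcommand, and -d tutorial is an option that limits the output to runs of one DAG (assuming a DAG with the tutorial ID exists, as in the examples below):
$ sudo /opt/airflow/bin/airflow dags list-runs -d tutorial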
Usage examples
Airflow provides several tools, one of which is the dags command that allows you to manage DAGs using the CLI.
For example, to get a list of existing DAGs, run:
$ sudo /opt/airflow/bin/airflow dags list
Example output:
dag_id                                   | filepath                                                                                            | owner   | paused
=========================================+=====================================================================================================+=========+=======
dataset_consumes_1                       | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_datasets.py                  | airflow | None
dataset_consumes_1_and_2                 | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_datasets.py                  | airflow | None
dataset_consumes_1_never_scheduled       | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_datasets.py                  | airflow | None
dataset_consumes_unknown_never_scheduled | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_datasets.py                  | airflow | None
dataset_produces_1                       | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_datasets.py                  | airflow | None
dataset_produces_2                       | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_datasets.py                  | airflow | None
example_bash_operator                    | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_bash_operator.py             | airflow | None
example_branch_datetime_operator         | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_branch_datetime_operator.py  | airflow | None
example_branch_datetime_operator_2       | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_branch_datetime_operator.py  | airflow | None
example_branch_datetime_operator_3       | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_branch_datetime_operator.py  | airflow | None
example_kubernetes_executor              | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_kubernetes_executor.py       | airflow | None
example_local_kubernetes_executor        | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_local_kubernetes_executor.py | airflow | None
example_nested_branch_dag                | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_nested_branch_dag.py         | airflow | None
example_params_trigger_ui                | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/example_params_trigger_ui.py         | airflow | None
tutorial                                 | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/tutorial.py                          | airflow | None
tutorial_dag                             | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/tutorial_dag.py                      | airflow | None
tutorial_taskflow_api                    | /opt/airflow/lib/python3.10/site-packages/airflow/example_dags/tutorial_taskflow_api.py             | airflow | None
To get the list of DAGs in a different format, use the --output option and provide one of the following formats as the argument: json, yaml, plain, or table.
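Since the command prints to standard output, the list can be exported to a file with a regular shell redirect. For example, the following command saves the list in JSON format to dags.json (an arbitrary file name):
$ sudo /opt/airflow/bin/airflow dags list --output json > dags.json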
To run a DAG, use the trigger command:
$ sudo /opt/airflow/bin/airflow dags trigger tutorial
Here tutorial is the ID of the DAG to run.
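Note that triggering a paused DAG does not start it immediately: the created DAG run stays queued until the DAG is unpaused. If necessary, unpause the DAG first by running:
$ sudo /opt/airflow/bin/airflow dags unpause tutorial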