Apache Airflow Sequential Executor
The Sequential Executor is an executor in Apache Airflow that runs task instances one at a time, in the order determined by their dependencies. It is the default executor in a standard Airflow installation and the only executor that works with a SQLite metadata database, which makes it well-suited to local development and simple pipelines that do not require parallel execution.
To use the Sequential Executor, you do not need to change anything in your DAG code. Define your tasks and the dependencies between them as usual, and the executor will run each task once its upstream tasks have completed. The executor itself is chosen in the Airflow configuration rather than in the DAG file, as shown below.
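For reference, here is a minimal configuration sketch. The executor is selected through the executor option in the [core] section of airflow.cfg; in a fresh installation this is already the value:

[core]
executor = SequentialExecutor

The same setting can be made with the environment variable AIRFLOW__CORE__EXECUTOR=SequentialExecutor, and you can check which executor is active with the command airflow config get-value core executor.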
Example of Sequential Executor
Here is an example of a simple two-task DAG. Under the Sequential Executor, its tasks run strictly one after the other:
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

dag = DAG(
    'my_dag',
    default_args={
        'owner': 'me',
    },
    start_date=datetime(2024, 1, 1),  # a start_date is needed before task instances can run
    schedule_interval=None,
)

task1 = BashOperator(
    task_id='task1',
    bash_command='echo "Hello, World!"',
    dag=dag,
)

task2 = BashOperator(
    task_id='task2',
    bash_command='echo "Goodbye, World!"',
    dag=dag,
)

task1 >> task2
In this example, task1 will be executed first, followed by task2.
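One way to run this example, assuming the DAG file is in your DAGs folder, is to start the scheduler and trigger a run from the command line. With the Sequential Executor, the scheduler process runs the task instances itself, one at a time:

# in one terminal: start the scheduler, which also executes the tasks under this executor
airflow scheduler

# in another terminal: unpause and trigger the example DAG
airflow dags unpause my_dag
airflow dags trigger my_dag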
More Detail
Here are a few more things to consider when using the Sequential Executor:
- The Sequential Executor does not support parallel execution of tasks: even tasks with no dependency between them run one after another (see the sketch after this list). If you have tasks that could run concurrently, consider a different executor, such as the Local Executor or the Celery Executor.
- The Sequential Executor has no configuration options of its own. Beyond selecting it as the executor (see the configuration snippet above), you simply define your tasks and their dependencies as usual.
- The Sequential Executor is well-suited for simple pipelines and local testing. For a complex pipeline with many independent tasks, a different executor lets you take advantage of that parallelism.
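As an illustration of the first point, here is a hypothetical DAG (the DAG and task names are invented for this sketch) in which one extract task feeds two independent transform tasks. Under the Sequential Executor the two transforms still run one after the other; an executor such as the Local Executor could run them at the same time:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

dag = DAG(
    'fan_out_example',
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
)

extract = BashOperator(task_id='extract', bash_command='echo "extract"', dag=dag)
transform_a = BashOperator(task_id='transform_a', bash_command='echo "transform a"', dag=dag)
transform_b = BashOperator(task_id='transform_b', bash_command='echo "transform b"', dag=dag)

# transform_a and transform_b do not depend on each other, but the
# Sequential Executor will still run them one at a time.
extract >> [transform_a, transform_b]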