Apache Airflow version
2.6.1
What happened
We are running 40 DAGs of the same kind, all scheduled to start at 05:00.
After upgrading to 2.6.1, DAGs are sometimes not scheduled, seemingly at random, and no DAG runs are created.

What you think should happen instead
I would expect to see a green column for the next period.
If something is failing, I would expect an error, or at least a warning, to be logged somewhere.
How to reproduce
It started happening after the upgrade from 2.3.2.
Operating System
Ubuntu
Versions of Apache Airflow Providers
❯ ./airflow.sh info
Apache Airflow
version | 2.6.1
executor | CeleryExecutor
task_logging_handler | airflow.utils.log.file_task_handler.FileTaskHandler
sql_alchemy_conn | postgresql+
| ow_us_east_1
dags_folder | /opt/airflow/dags
plugins_folder | /opt/airflow/plugins
base_log_folder | /opt/airflow/logs
remote_base_log_folder |
System info
OS | Linux
architecture | x86_64
uname | uname_result(system='Linux', node='da226ebb4b23', release='4.14.313-235.533.amzn2.x86_64', version='#1 SMP Tue Apr 25 15:24:19 UTC 2023',
| machine='x86_64', processor='')
locale | ('en_US', 'UTF-8')
python_version | 3.7.16 (default, May 3 2023, 10:31:59) [GCC 10.2.1 20210110]
python_location | /usr/local/bin/python
Tools info
git | NOT AVAILABLE
ssh | OpenSSH_8.4p1 Debian-5+deb11u1, OpenSSL 1.1.1n 15 Mar 2022
kubectl | NOT AVAILABLE
gcloud | NOT AVAILABLE
cloud_sql_proxy | NOT AVAILABLE
mysql | mysql Ver 8.0.33 for Linux on x86_64 (MySQL Community Server - GPL)
sqlite3 | 3.34.1 2021-01-20 14:10:07 10e20c0b43500cfb9bbc0eaa061c57514f715d87238f4d835880cd846b9ealt1
psql | psql (PostgreSQL) 15.3 (Debian 15.3-1.pgdg110+1)
Paths info
airflow_home | /opt/airflow
system_path | /root/bin:/home/airflow/.local/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
python_path | /home/airflow/.local/bin:/usr/local/lib/python37.zip:/usr/local/lib/python3.7:/usr/local/lib/python3.7/lib-dynload:/home/airflow/.local/lib/python3.7
| /site-packages:/usr/local/lib/python3.7/site-packages:/opt/airflow/dags:/opt/airflow/config:/opt/airflow/plugins
airflow_on_path | True
Providers info
apache-airflow-providers-amazon | 8.0.0
apache-airflow-providers-celery | 3.1.0
apache-airflow-providers-cncf-kubernetes | 6.1.0
apache-airflow-providers-common-sql | 1.4.0
apache-airflow-providers-docker | 3.6.0
apache-airflow-providers-elasticsearch | 4.4.0
apache-airflow-providers-ftp | 3.3.1
apache-airflow-providers-google | 10.0.0
apache-airflow-providers-grpc | 3.1.0
apache-airflow-providers-hashicorp | 3.3.1
apache-airflow-providers-http | 4.3.0
apache-airflow-providers-imap | 3.1.1
apache-airflow-providers-microsoft-azure | 6.0.0
apache-airflow-providers-mysql | 5.0.0
apache-airflow-providers-odbc | 3.2.1
apache-airflow-providers-postgres | 5.4.0
apache-airflow-providers-redis | 3.1.0
apache-airflow-providers-sendgrid | 3.1.0
apache-airflow-providers-sftp | 4.2.4
apache-airflow-providers-slack | 7.2.0
apache-airflow-providers-snowflake | 4.0.5
apache-airflow-providers-sqlite | 3.3.2
apache-airflow-providers-ssh | 3.6.0
Deployment
Docker-Compose
Deployment details
I run Airflow from docker-compose.
Anything else
When I manually pause and unpause the DAG, nothing happens.
The audit log shows no record of anyone trying to run the DAG.
None of the Postgres tables contain entries/rows for the failing DAG for the missing dates.
No logs were created for the missing days.
There are no errors in the other log files.
I tried allocating a lot of memory to the container, and then it works.
I added a swap file, but it appears never to have been used.
The tasks run dbt.
For the DAG processor, I see the following PIDs from time to time:
File Path PID Runtime # DAGs # Errors Last Runtime Last Run
------------------------------------------------------------------------ ----- --------- -------- ---------- -------------- -------------------
/opt/airflow/dags/insights/DAGNAME.py 3710 0.53s 1 0 1.60s 2023-06-02T05:00:04
/opt/airflow/dags/insights/DAGNAME.py 7023 0.00s 1 0 0.62s 2023-06-02T05:29:44
/opt/airflow/dags/insights/DAGNAME.py 7095 0.01s 1 0 0.30s 2023-06-02T05:30:14
/opt/airflow/dags/insights/DAGNAME.py 24199 0.02s 1 0 0.43s 2023-06-02T07:43:08
/opt/airflow/dags/insights/DAGNAME.py 30931 0.00s 1 0 0.35s 2023-06-02T08:35:18
/opt/airflow/dags/insights/DAGNAME.py 4938 0.01s 1 0 0.29s 2023-06-02T09:25:29
/opt/airflow/dags/insights/DAGNAME.py 11316 0.46s 1 0 0.94s 2023-06-02T10:15:12
/opt/airflow/dags/insights/DAGNAME.py 11377 0.01s 1 0 0.48s 2023-06-02T10:15:44
/opt/airflow/dags/insights/DAGNAME.py 18417 0.01s 1 0 0.40s 2023-06-02T11:10:28
/opt/airflow/dags/insights/DAGNAME.py 23364 0.01s 1 0 0.48s
In the scheduler log I see that nothing is scheduled at the expected time:
[2023-06-02T04:59:32.427+0000] {logging_mixin.py:149} INFO - [2023-06-02T04:59:32.427+0000] {dag.py:3490} INFO - Setting next_dagrun for DAGNAME to 2023-06-02T05:00:00+00:00, run_after=2023-06-02T05:00:00+00:00
[2023-06-02T04:59:32.457+0000] {processor.py:179} INFO - Processing /opt/airflow/dags/insights/DAGNAME.py took 0.380 seconds
[2023-06-02T05:00:03.365+0000] {processor.py:157} INFO - Started process (PID=3583) to work on /opt/airflow/dags/insights/DAGNAME.py
[2023-06-02T05:00:03.372+0000] {processor.py:826} INFO - Processing file /opt/airflow/dags/insights/DAGNAME.py for tasks to queue
[2023-06-02T05:00:03.372+0000] {logging_mixin.py:149} INFO - [2023-06-02T05:00:03.372+0000] {dagbag.py:541} INFO - Filling up the DagBag from /opt/airflow/dags/insights/DAGNAME.py
[2023-06-02T05:00:04.315+0000] {processor.py:836} INFO - DAG(s) dict_keys(['DAGNAME']) retrieved from /opt/airflow/dags/insights/DAGNAME.py
[2023-06-02T05:00:04.430+0000] {logging_mixin.py:149} INFO - [2023-06-02T05:00:04.430+0000] {dag.py:2726} INFO - Sync 1 DAGs
[2023-06-02T05:00:04.611+0000] {logging_mixin.py:149} INFO - [2023-06-02T05:00:04.611+0000] {dag.py:3490} INFO - Setting next_dagrun for DAGNAME to 2023-06-03T05:00:00+00:00, run_after=2023-06-03T05:00:00+00:00
[2023-06-02T05:00:04.941+0000] {processor.py:179} INFO - Processing /opt/airflow/dags/insights/DAGNAME.py took 1.580 seconds
[2023-06-02T05:02:54.062+0000] {processor.py:157} INFO - Started process (PID=3710) to work on /opt/airflow/dags/insights/DAGNAME.py
[2023-06-02T05:02:54.094+0000] {processor.py:826} INFO - Processing file /opt/airflow/dags/insights/DAGNAME.py for tasks to queue
[2023-06-02T05:02:54.095+0000] {logging_mixin.py:149} INFO - [2023-06-02T05:02:54.094+0000] {dagbag.py:541} INFO - Filling up the DagBag from /opt/airflow/dags/insights/DAGNAME.py
[2023-06-02T05:02:57.813+0000] {processor.py:836} INFO - DAG(s) dict_keys(['DAGNAME']) retrieved from /opt/airflow/dags/insights/DAGNAME.py
[2023-06-02T05:02:59.602+0000] {logging_mixin.py:149} INFO - [2023-06-02T05:02:59.602+0000] {dag.py:2726} INFO - Sync 1 DAGs
[2023-06-02T05:03:00.611+0000] {logging_mixin.py:149} INFO - [2023-06-02T05:03:00.611+0000] {dag.py:3490} INFO - Setting next_dagrun for DAGNAME to 2023-06-03T05:00:00+00:00, run_after=2023-06-03T05:00:00+00:00
[2023-06-02T05:03:01.444+0000] {processor.py:179} INFO - Processing /opt/airflow/dags/insights/DAGNAME.py took 7.531 seconds
[2023-06-02T05:03:31.833+0000] {processor.py:157} INFO - Started process (PID=3777) to work on /opt/airflow/dags/insights/DAGNAME.py
[2023-06-02T05:03:31.835+0000] {processor.py:826} INFO - Processing file /opt/airflow/dags/insights/DAGNAME.py for tasks to queue
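The `next_dagrun` values in the log match what the plain daily-interval arithmetic would predict (the schedule appears to be daily at 05:00 UTC, judging by the log lines); what never happens is the actual DAG run creation at that instant. A minimal sketch of the expected progression (`expected_next_dagrun` is an illustrative helper, not an Airflow API):

```python
from datetime import datetime, timedelta, timezone

def expected_next_dagrun(current_next):
    # For a plain daily schedule, the next logical date advances by one day
    return current_next + timedelta(days=1)

run = datetime(2023, 6, 2, 5, 0, tzinfo=timezone.utc)
print(expected_next_dagrun(run).isoformat())
# -> 2023-06-03T05:00:00+00:00
```

This matches the log's jump from `next_dagrun ... 2023-06-02T05:00:00+00:00` to `2023-06-03T05:00:00+00:00`, even though no run for 2023-06-02 was ever created.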
Are you willing to submit PR?
Code of Conduct