I'm trying to use Airflow for concurrent DAG runs, where each run gets its own dynamically generated configuration file.
I'm running into an issue with XCom variables, since their scope is limited to a single DAG run.
I'm looking for alternatives that let me run DAG runs concurrently, each with a separate configuration file that intermediate tasks can read. A minimal sketch of the setup is below.
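Roughly, this is what I mean (the DAG id and conf keys are just placeholders): each run is triggered externally with its own conf, and intermediate tasks need that run's configuration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def use_run_config(**context):
    # Per-run configuration passed at trigger time, e.g.
    # airflow dags trigger example_dag --conf '{"input_path": "..."}'
    conf = context["dag_run"].conf or {}
    print(conf.get("input_path"))


with DAG(
    dag_id="example_dag",
    start_date=datetime(2023, 1, 1),
    schedule=None,       # runs are triggered externally
    max_active_runs=16,  # allow several runs in flight at once
) as dag:
    PythonOperator(task_id="use_run_config", python_callable=use_run_config)
```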
Have you considered writing the output of tasks to a canonical data lake (an S3 or GCS bucket) and reading it back in downstream tasks or the next DAG run? This also allows tasks to be idempotent.
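A minimal sketch of what that could look like, assuming an S3 bucket (the bucket name and key layout are placeholders): each task writes under a key namespaced by `run_id`, so concurrent runs never collide, and a re-run simply overwrites the same key.

```python
import json
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

BUCKET = "my-data-lake"  # placeholder bucket name


def write_config(**context):
    # Persist this run's config under a run_id-scoped key so
    # concurrent runs write to distinct locations.
    run_id = context["run_id"]
    config = context["dag_run"].conf or {}
    S3Hook().load_string(
        json.dumps(config),
        key=f"runs/{run_id}/config.json",
        bucket_name=BUCKET,
        replace=True,  # idempotent: a re-run overwrites the same key
    )


def read_config(**context):
    # Downstream tasks read the same run-scoped key instead of using XCom.
    run_id = context["run_id"]
    config = json.loads(
        S3Hook().read_key(f"runs/{run_id}/config.json", bucket_name=BUCKET)
    )
    print(config)


with DAG(
    dag_id="per_run_config",
    start_date=datetime(2023, 1, 1),
    schedule=None,
    max_active_runs=16,
) as dag:
    write = PythonOperator(task_id="write_config", python_callable=write_config)
    read = PythonOperator(task_id="read_config", python_callable=read_config)
    write >> read
```

Because every key includes the `run_id`, each run's intermediate state stays isolated without relying on XCom, and any task can be retried safely.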