Concurrent DAG runs - XCom variables


I’m trying to use Airflow for concurrent DAG runs, where each run has its own dynamically generated configuration file.
I’m running into an issue with XCom variables, since their scope is limited to a single DAG run.

I’m looking for alternatives that allow concurrent DAG runs with a separate configuration for each run, where the configuration file is also used by intermediate tasks.

Have you considered writing task output to a canonical data lake (S3 or GCS buckets) and reading from it in your next DAG run? This also allows tasks to be idempotent.
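A minimal sketch of that idea in plain Python (function names and paths are illustrative, not from any Airflow API): each task writes to and reads from a path keyed by the run ID, so concurrent runs stay isolated and re-running a task just overwrites the same file.

```python
import json
import tempfile
from pathlib import Path

# Illustrative base path; in practice this would be a shared store
# (an S3/GCS bucket, or a mounted distributed filesystem).
BASE = Path(tempfile.mkdtemp())

def write_config(run_id, config):
    """Write this run's config to a run-scoped path.
    Idempotent: re-running the task overwrites the same file."""
    path = BASE / run_id / "config.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config))
    return path

def read_config(run_id):
    """Downstream tasks read the same run-scoped path, so
    concurrent runs never see each other's files."""
    return json.loads((BASE / run_id / "config.json").read_text())

# Two concurrent runs with different configs stay isolated:
write_config("run_a", {"source": "tableA"})
write_config("run_b", {"source": "tableB"})
print(read_config("run_a")["source"])  # -> tableA
print(read_config("run_b")["source"])  # -> tableB
```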

I’m using a MapR Hadoop cluster; AWS and Google Cloud aren’t an option.

I’ve never used Hadoop, so maybe you can get a better answer by jumping over to the Airflow Slack: