How to git-sync DAGs in dynamically created KubernetesExecutor worker pods

I am new to Airflow and am facing some issues. I have already deployed Airflow with the Celery Executor on AKS, and I am now trying to deploy it with the Kubernetes Executor on Azure Kubernetes Service. I am using the helm chart provided by tekn0ir for this purpose, with some modifications, and deployed it successfully with kubectl. It has pods for the scheduler, webserver, and PostgreSQL, plus dynamically created pods for running task instances. To synchronize DAGs, I used a git-sync init container, which successfully syncs DAGs on both the scheduler and the webserver. However, when I trigger a DAG, a new pod is successfully created to run the task instance, but it throws an error. The logs for that pod suggest the DAGs were not synchronized to the worker pod:

How can I resolve this so that the DAGs get synchronized into the worker pods, which are created dynamically by the Kubernetes Executor and therefore have no Deployment manifest where a git-sync container could be mounted?
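For context, my current understanding is that the Kubernetes Executor builds the worker pod spec from the `[kubernetes]` section of airflow.cfg (or the matching `AIRFLOW__KUBERNETES__*` environment variables on the scheduler) rather than from any Deployment manifest, so settings roughly like the following should be what controls git-sync on the workers. This is only a sketch: the key names come from the Airflow 1.10.x config template and vary a bit between releases, and the repository URL, branch, and mount path are placeholders, not my real values.

```ini
[kubernetes]
# Worker pod specs are generated from these settings, not from any Deployment file.
# Each key can also be set as AIRFLOW__KUBERNETES__<KEY> in the scheduler's environment.
dags_in_image = False
git_repo = https://github.com/<your-org>/<your-dags-repo>.git
git_branch = master
# Path to the DAG files inside the repository.
git_subpath = dags
# git-sync image used for the init container injected into each worker pod.
git_sync_container_repository = k8s.gcr.io/git-sync
git_sync_container_tag = v3.1.1
git_sync_init_container_name = git-sync-clone
# Where the synced DAGs end up inside the worker pod; must match the image's DAGs folder.
git_dags_folder_mount_point = /usr/local/airflow/dags
```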

Hi Naman,

Two questions: 1. Have you considered just baking the DAGs into the Docker image? 2. Can you point me to this helm chart? I'm wondering if the Airflow paths might be different on the workers.
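To illustrate option 1, the image can be as small as the sketch below. The base image and path are just an example rather than the chart's actual image (the official apache/airflow image keeps its DAGs under /opt/airflow/dags); substitute whatever worker image the chart uses.

```dockerfile
# Example base image only: swap in the worker image your chart actually uses.
FROM apache/airflow:1.10.12

# Copy the DAG files from the local checkout into the image's DAGs folder,
# so scheduler, webserver, and worker pods all start with the same DAGs.
COPY dags/ /opt/airflow/dags/
```

If you go this route, you would also set `dags_in_image = True` in the `[kubernetes]` section so the executor stops expecting a volume or git-sync container for DAGs.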

As far as prebaking the DAGs into the Docker image is concerned, the image being used is also a Docker Hub image from tekn0ir, so I haven't had the opportunity to change the Dockerfile. Also, I am frequently working on the DAGs, so git-sync seems like a better option to me than baking them in.
This is the link to the helm chart: https://github.com/tekn0ir/airflow-chart.
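In the meantime, I plan to inspect one of the dynamically created worker pods to check whether the git-sync init container and the DAGs volume actually make it into the pod spec. The pod name and namespace below are placeholders, and `git-sync-clone` is just the default init-container name from the config template:

```bash
# Setting delete_worker_pods = False in [kubernetes] keeps finished/failed
# worker pods around long enough to inspect them.
kubectl get pods -n <airflow-namespace>

# Dump the full pod spec: look for an initContainers entry running git-sync
# and for a volume mounted at the DAGs folder.
kubectl get pod <worker-pod-name> -n <airflow-namespace> -o yaml

# If the init container is present, its logs show whether the clone succeeded.
kubectl logs <worker-pod-name> -n <airflow-namespace> -c git-sync-clone
```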
Thanks in advance