DAG runs fine the first time, but the whole Airflow service gets really slow afterwards


I’m experiencing a slowdown each time I run my DAG. I’m not sure what is happening, but I fear I may be hitting a resource limit, and I'm not sure how to check.

Essentially, I have created a DAG that uses Selenium to do some web automation. It works perfectly fine the first time, but when I try to manually trigger the DAG a second time, the whole Airflow service in Docker (running locally) seems to hang. I can hardly load anything, and eventually I just get a browser error like “This page isn’t working - localhost didn’t send any data”.

Has anyone experienced this before? After a DAG run the service gets sluggish (slow load times, etc.) until it stops working entirely. The only way to recover is to redeploy the Docker instance.

Thanks for any help in advance!

Update: looks like the issue was a memory leak. I was using Xvfb to create a virtual display so I didn’t have to run chromedriver in headless mode. I was missing the piece of code that shut off the display, so each time I ran the DAG, memory usage eventually hit its max because nothing was cleaning up the virtual displays.
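For anyone hitting the same thing, the fix is just making sure the display is stopped even when the task fails. Here's a minimal sketch of the pattern; `FakeDisplay` is a stand-in for the real display object (in my case `pyvirtualdisplay.Display`, which has the same `start()`/`stop()` interface), and `run_task` stands in for the Selenium automation step:

```python
class FakeDisplay:
    """Stand-in for pyvirtualdisplay.Display (illustrative only)."""

    def __init__(self):
        self.running = False

    def start(self):
        # Real version spawns an Xvfb process here
        self.running = True

    def stop(self):
        # Real version kills the Xvfb process, releasing its memory
        self.running = False


def run_task(display, fail=False):
    """Simulates one DAG run that uses a virtual display."""
    display.start()
    try:
        if fail:
            raise RuntimeError("selenium step failed")
        return "done"
    finally:
        # The piece I was missing: stop the display on every exit path,
        # success or failure, so runs don't leak Xvfb processes
        display.stop()
```

With the real library, the same effect can be had by using the display as a context manager (`with Display(): ...`), which calls `stop()` for you.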