I am running a DAG with a task that extracts data from SVN. The task runs properly for about 3 hours, and then fails with the error: "the job killed itself, most likely due to out of memory".

I am running a task in an Airflow DAG that extracts data from SVN. The task runs properly for 3-4 hours, and then it shows an out-of-memory error.
The error can be seen in the screenshot. When I run this Python function on my local system, it uses about 2.9 GB of memory.
My Airflow Docker container has a 12 GB memory limit.
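To narrow down where the memory actually grows inside the task (local usage of 2.9 GB vs. an OOM kill under a 12 GB limit suggests the container process peaks much higher than the local run), a small helper that logs peak resident memory at checkpoints could be added to the task. This is a diagnostic sketch only, using the standard-library `resource` module; the `extract_from_svn` call sites are hypothetical placeholders, and note that on Linux `ru_maxrss` is reported in KiB (on macOS it is bytes).

```python
import resource


def log_peak_memory(label: str) -> float:
    """Log and return the process's peak resident set size in MiB.

    Assumes Linux, where ru_maxrss is reported in KiB.
    """
    peak_kib = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    peak_mib = peak_kib / 1024
    print(f"{label}: peak RSS = {peak_mib:.1f} MiB")
    return peak_mib


# Hypothetical usage inside the SVN extraction task:
#
# log_peak_memory("before extraction")
# for batch in svn_batches:          # placeholder for the real extraction loop
#     process(batch)
#     log_peak_memory(f"after batch {batch}")
```

Sprinkling these checkpoints through the extraction loop shows in the Airflow task log whether memory climbs steadily (e.g. results accumulated in a list instead of being streamed out) or jumps at one specific step, which is usually the first question to answer before raising the container limit.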