Hi,
1- I saw in the course that Airflow is neither a data streaming solution nor a data processing framework. For example:
If I want to take 15,000 records from a table in my Postgres database, or read data from a CSV file, instead of creating a Python function in the DAG to process the data, is it better to create another Python file that connects to the data and processes it?
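To give an idea of what I mean, something like this (just a sketch; the dags/include/process_data.py module, the process_records function and the connection details are names and values I made up for the example):

# dags/include/process_data.py -- hypothetical helper module, kept out of the DAG file
import psycopg2  # assuming a plain psycopg2 connection just for the example


def process_records():
    # Fetch 15000 rows from Postgres and process them outside the DAG file
    conn = psycopg2.connect(host="localhost", dbname="mydb", user="me", password="secret")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT * FROM my_table LIMIT 15000")
            rows = cur.fetchall()
        # ... the actual processing of `rows` would go here ...
        return len(rows)
    finally:
        conn.close()


# dags/calculate_dag.py -- the DAG file only orchestrates and calls the helper
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

from include.process_data import process_records  # imported, not defined in the DAG file

with DAG(
    dag_id="calculate_dag",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    process = PythonOperator(
        task_id="process_records",
        python_callable=process_records,  # heavy lifting lives in the helper module
    )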
2- In the dags directory I have calculate_dag.py. Is it good practice to create other directories to implement classes and functions, and then call them from calculate_dag.py?
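For example, I am imagining a layout like this (the directory and file names are just examples):

dags/
    calculate_dag.py          # only the DAG and task definitions
    include/
        __init__.py
        process_data.py       # connection + processing functions
        helpers.py            # shared classes and utilities

so that calculate_dag.py only imports and calls those functions (e.g. from include.process_data import process_records).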
Thank you.