SparkSubmitOperator

Having trouble getting the Spark modules recognized. It's not highlighted as an error in my code (VS Code), but Airflow won't bring up the DAG, stating that this line:

from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

has an error because:

ModuleNotFoundError: No module named 'airflow.providers.apache'
However, I have it installed, and it shows up when I pip freeze (which matches my requirements.txt):
apache-airflow==2.10.2
apache-airflow-providers-apache-spark==4.11.0
apache-airflow-providers-common-compat==1.2.0
apache-airflow-providers-common-io==1.4.1
apache-airflow-providers-common-sql==1.17.0
apache-airflow-providers-fab==1.4.0
apache-airflow-providers-ftp==3.11.1
apache-airflow-providers-http==4.13.1
apache-airflow-providers-imap==3.7.0
apache-airflow-providers-postgres==5.13.0
apache-airflow-providers-smtp==1.8.0
apache-airflow-providers-sqlite==3.9.0

Airflow also won’t recognize pyspark:
from pyspark.sql import SparkSession
ModuleNotFoundError: No module named 'pyspark'

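For reference, the Spark job I want to submit is only a minimal SparkSession script for now, roughly this (file name and contents are a placeholder sketch, not my real job):

# application.py - minimal placeholder job, just enough to prove spark-submit works
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("airflow_spark_test").getOrCreate()

# Tiny in-memory DataFrame so the job produces visible output in the logs
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.show()

spark.stop()
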
I haven’t run my spark submit task yet but have coded it up:
spark_submit_task = SparkSubmitOperator(
    task_id='spark_job',
    application='path/to/your/spark/application.py',
    conn_id='spark_default',
    executor_cores=1,
    executor_memory='2g',
    num_executors=1,
    driver_memory='2g',
    verbose=False,
    spark_binary='/usr/local/spark/bin/spark-submit',
)
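
For completeness, the task sits inside a DAG along these lines (dag_id, schedule, and dates are placeholders rather than my exact file):

from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_submit_example",  # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule=None,  # trigger manually while testing
    catchup=False,
) as dag:
    spark_submit_task = SparkSubmitOperator(
        task_id="spark_job",
        application="path/to/your/spark/application.py",
        conn_id="spark_default",
    )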

And I'm running Airflow via a docker-compose.yml file that includes a Spark image as one of the services:
spark:
  image: apache/spark:latest
  restart: unless-stopped  # Restart container if it exits unexpectedly
  ports:
    - "7077:7077"  # Spark master port (optional to expose)
    - "8080:8080"  # Spark master web UI (optional to expose)
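
Since Airflow itself runs from this compose file, I suspect the packages in my host's pip freeze might not be the ones the scheduler and webserver containers actually see. This is a quick check I plan to run inside the webserver container with docker compose exec (the airflow-webserver service name and the /opt/airflow/dags mount are assumptions based on the standard Airflow compose setup; mine may differ), e.g. by dropping a check_imports.py into the dags folder and running it with the container's python:

import importlib

# Report whether the provider package and pyspark resolve inside the container,
# and where they resolve from, to compare against the host environment.
for mod in ("airflow.providers.apache.spark", "pyspark"):
    try:
        module = importlib.import_module(mod)
        print(f"{mod}: found at {module.__file__}")
    except ModuleNotFoundError as err:
        print(f"{mod}: MISSING ({err})")

If these turn out to be missing inside the container, that would explain the ModuleNotFoundError even though my host's pip freeze lists them.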