Airflow still loads the example DAGs and new DAGs are not picked up; why do airflow db init / reset use SQLite?

I set load_examples = False in airflow.cfg, but after running:

airflow db reset
airflow db init

pkill -9 -f "airflow webserver"
pkill -9 -f "gunicorn"
pkill -9 -f "airflow scheduler"

Airflow still loads all the example DAGs, and newly defined user DAGs are not picked up either.
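One way to see which settings Airflow actually resolves (a diagnostic sketch, assuming Airflow 2.x, where the `airflow info` and `airflow config get-value` commands exist):

```shell
# Print the AIRFLOW_HOME, config file, and DB backend Airflow is actually using
airflow info

# Show the values Airflow resolved after merging airflow.cfg and environment variables
airflow config get-value core sql_alchemy_conn
airflow config get-value core load_examples
```

If sql_alchemy_conn still resolves to a sqlite:/// URL here, Airflow is reading a different airflow.cfg (or an environment-variable override) than the file that was edited.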

executor = CeleryExecutor
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@127.0.0.1:5432/airflow

celery_result_backend = db+postgresql://airflow:airflow@localhost:5432/airflow

The scheduler log still shows SQLite:
WARNING - Because we cannot use more than 1 thread (parsing_processes = 2 ) when using sqlite. So we set parallelism to 1.

airflow db reset
DB: sqlite:////$HOME/airflow/airflow.db

Similarly:
airflow db init
DB: sqlite:////$HOME/airflow/airflow.db
[2021-03-05 16:15:45,640] {db.py:674} INFO - Creating tables
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.

Why are these commands still using SQLite when the configuration was changed to Postgres?
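For reference, a check I can run locally (a sketch; it assumes the documented precedence where AIRFLOW__SECTION__KEY environment variables override airflow.cfg, and that the default home is ~/airflow when AIRFLOW_HOME is unset):

```shell
# Which home directory (and therefore which airflow.cfg) Airflow will read
echo "AIRFLOW_HOME=${AIRFLOW_HOME:-$HOME/airflow (default)}"

# Any AIRFLOW__* variable silently overrides the file, e.g. AIRFLOW__CORE__SQL_ALCHEMY_CONN
env | grep '^AIRFLOW__' || echo "no AIRFLOW__* overrides set"

# What the cfg in that home actually says (suppress the error if the file is elsewhere)
grep -E '^(sql_alchemy_conn|load_examples)' "${AIRFLOW_HOME:-$HOME/airflow}/airflow.cfg" 2>/dev/null || true
```

If the edited airflow.cfg lives in a different directory than the one printed on the first line, that would explain both the SQLite URL and the ignored load_examples setting.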