Hi,
I have Databricks running on AWS and a Databricks connection configured in Airflow (MWAA). I am able to connect and trigger a Databricks job from Airflow using a personal access token (with DatabricksRunNowOperator). I believe the best practice is to connect using a service principal instead. My understanding is that I should authenticate with the service principal's client ID and OAuth secret, but I get a 401 error, which I believe is the result of an incorrect OAuth M2M setup.
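For reference, the working PAT-based task is defined roughly like this (the DAG id, job id, and connection name below are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="run_databricks_job",      # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,                    # triggered manually for now
    catchup=False,
) as dag:
    run_job = DatabricksRunNowOperator(
        task_id="run_job",
        databricks_conn_id="databricks_default",  # Airflow connection that currently holds the PAT
        job_id=12345,                             # placeholder Databricks job id
    )
```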
In Airflow, when I create the Databricks connection, am I correct that the Login value should be the service principal's client ID and the Password value should be its OAuth secret?
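Concretely, this is the equivalent of what I am entering for the connection (workspace URL, client ID, and secret are placeholders):

```python
from airflow.models import Connection

# Equivalent of the connection I created through the Airflow UI.
databricks_sp_conn = Connection(
    conn_id="databricks_sp",                                # placeholder connection name
    conn_type="databricks",
    host="https://dbc-xxxxxxxx-xxxx.cloud.databricks.com",  # workspace URL placeholder
    login="<service-principal-client-id>",                  # the service principal's application/client ID
    password="<service-principal-oauth-secret>",            # the OAuth secret generated for it
    # Extra is currently empty - not sure whether something else is required here for OAuth M2M
)
```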
Can someone shed some light on how this should be done?
Thanks.