Connect to remote database airflow


I am new to Airflow. I have to create a DAG that inserts data into a table in a remote database. Steps done so far:

  1. After installation, ran pip install 'apache-airflow[oracle]' so that I can use the Oracle operator
  2. Created a connection from the UI to connect to the remote database
  3. Then I wrote my first DAG, but it did not seem to run.

Issues faced:

  1. I am not sure where to place my DAG. Currently I have placed it in my home directory, which might be why it is not visible in the UI. I tried to run it directly from the Unix shell with the command: python (unknown). But I got an "invalid syntax" error on the line where I defined the SQL.
    The command below does not work, maybe because my DAG is not visible in list_dags.

airflow run dagname taskname 2020-01-20
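By default Airflow only picks up DAG files placed in the dags_folder configured in airflow.cfg (usually ~/airflow/dags). A minimal stdlib sketch to read that setting out of the config file, assuming the default ~/airflow/airflow.cfg location:

```python
import configparser
import os

def get_dags_folder(cfg_path):
    """Return the [core] dags_folder setting from an airflow.cfg file,
    or None if the key is missing."""
    cfg = configparser.ConfigParser()
    cfg.read(cfg_path)
    return cfg.get('core', 'dags_folder', fallback=None)

# Example usage against the default AIRFLOW_HOME location:
# print(get_dags_folder(os.path.expanduser('~/airflow/airflow.cfg')))
```

Moving the DAG file into that folder (and waiting for the scheduler to rescan, or restarting the webserver) should make it appear in the UI and in list_dags.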

  2. When I go to Data Profiling and run any query against the connection, I get the error: TNS listener was not given the SERVICE_NAME in CONNECT_DATA
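That error usually means the connection does not tell the listener which database service to use. In Airflow's Oracle hook, the connection's Extra field (a JSON string) can carry keys such as dsn, sid, or service_name. A sketch of the JSON to paste into the Extra field of the connection in the UI, with hypothetical host and service names you would replace with your own:

```python
import json

# Hypothetical values -- substitute your real DSN host and service name.
# The Oracle hook reads these keys from the connection's Extra JSON.
extra = {"dsn": "dbhost.example.com", "service_name": "ORCLPDB1"}
print(json.dumps(extra))
```

Whether service_name is honored depends on your Airflow version, so check the oracle_hook source for the exact keys it supports.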

Could somebody please help me resolve these issues? I believe I have missed something. Maybe cx_Oracle and the Oracle client need to be installed on the same machine as Airflow?
Or do some changes need to be made to the airflow.cfg file? (The cfg file I found was the one in my home directory.)
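On the cx_Oracle question: yes, the [oracle] extra installs the cx_Oracle Python package, but actually connecting also requires the Oracle client libraries (e.g. Instant Client) on the machine running the Airflow scheduler/worker. A quick sketch to check whether the Python bindings are at least importable:

```python
def oracle_bindings_available():
    """Return True if the cx_Oracle driver can be imported.
    Note: a successful import only proves the package is installed;
    connecting additionally needs the Oracle client shared libraries
    on the library path."""
    try:
        import cx_Oracle  # noqa: F401  (installed by apache-airflow[oracle])
        return True
    except ImportError:
        return False

print(oracle_bindings_available())
```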

The code I wrote is below:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.oracle_operator import OracleOperator

default_args = {
    'owner': 'Airflow',
    'depends_on_past': False,
    'start_date': datetime(2020, 1, 17),  # 01 is invalid syntax in Python 3; use 1
    'email': [''],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5)
}  # this closing brace was missing

with DAG('example_sql_dag',
         default_args=default_args,  # default_args was never passed to the DAG
         schedule_interval='*/10 * * * *') as dag:
    opr_sql = OracleOperator(task_id='task_sql',
                             oracle_conn_id='oracle_conn',  # id of the connection created in the UI
                             sql='insert into table1 (a, b, c) values (1, 2, 3)',
                             autocommit=True)  # boolean, not the string 'True'