Hi
I am new to Airflow. I have to create a DAG that inserts data into a table in a remote Oracle database. Steps done so far:
- After installing Airflow, ran pip install 'apache-airflow[oracle]' so that I can use the Oracle operator (a quick check for this is sketched right after this list).
- Created a connection from the UI to connect to the remote database.
- Then I wrote my first DAG, but it did not seem to run.
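To double-check that the oracle extra actually pulled in the client library, this is the quick import test I was going to run in the same Python environment that runs Airflow (just my own sanity check, not from any guide):

# sanity check: the imports themselves are the test; if either fails,
# the oracle extra did not install correctly in this environment
import cx_Oracle
from airflow.operators.oracle_operator import OracleOperator

print(cx_Oracle.version)  # installed cx_Oracle version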
Issues faced:
- I am not sure where to place my DAG file. Currently it is in my home directory, which might be why it is not visible in the UI (see the folder check I sketched after my questions below). I also tried running it directly from the shell with: python {filename}, but I got an "invalid syntax" error on the line where I define the SQL.
The command below also does not work, maybe because my DAG is not visible in list_dags:
airflow run dagname taskname 2020-1-20
- When I go to Data Profiling and run any query against the connection, I get the error: TNS listener was not given SERVICE_NAME in CONNECT_DATA.
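My guess is that the connection I defined is missing a service name, so I want to rule out the database side first. This is a standalone cx_Oracle test I put together, completely outside Airflow (host, port, service name, and credentials are placeholders for my remote database):

# standalone connectivity test that bypasses Airflow entirely;
# all values below are placeholders for my remote Oracle database
import cx_Oracle

dsn = cx_Oracle.makedsn('dbhost.example.com', 1521, service_name='MYSERVICE')
conn = cx_Oracle.connect(user='my_user', password='my_password', dsn=dsn)
print(conn.version)  # Oracle server version, if the connection succeeds
conn.close()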
Could somebody please help me resolve these issues? I believe I have missed something. Maybe cx_Oracle and the Oracle client need to be installed on the same machine as Airflow?
Or do some changes need to be made to the airflow.cfg file? (To add to that: the only cfg file I found was the one in my home directory.)
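For the DAG placement part, this is the check I was going to run from the same environment, to see which folder Airflow scans for DAG files and whether my DAG is picked up at all (nothing here is specific to my setup):

# ask Airflow itself where it looks for DAG files and which DAGs it has parsed
from airflow.configuration import conf
from airflow.models import DagBag

print(conf.get('core', 'dags_folder'))  # the folder my .py file should live in
print(DagBag().dag_ids)                 # should include 'example_sql_dag' once the file is there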
The code I wrote is below:
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.oracle_operator import OracleOperator

default_args = {
    'owner': 'Airflow',
    'depends_on_past': False,
    'start_date': datetime(2020, 1, 17),
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5)
}

with DAG('example_sql_dag',
         default_args=default_args,
         catchup=False,
         schedule_interval='*/10 * * * *') as dag:

    # single insert into the remote table, using the connection defined in the UI
    opr_sql = OracleOperator(task_id='task_sql',
                             oracle_conn_id='Oracle_schema',
                             sql='insert into table1 (a, b, c) values (1, 2, 3)',
                             autocommit=True)