Hi all,
In one DAG I have 15 parallel streams of Spark job submissions, each using the Livy Operator followed by a Livy Monitor.
I want to be sure that no more than 5 Spark jobs from this DAG are running simultaneously.
I have tried setting concurrency = 5 in the DAG configuration, but because monitoring of the Spark job execution happens in a separate operator it didn't work: it first submitted 5 jobs, waited for their confirmation, and only then submitted the rest (and that's ok :)).
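For reference, here is a minimal sketch of the DAG layout I described (the dag_id is made up, and DummyOperator is a stand-in for the actual Livy submit and monitor operators):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator  # stand-in for the Livy operators

with DAG(
    dag_id="spark_livy_streams",          # hypothetical name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    concurrency=5,  # what I tried: limits running *tasks*, not running Spark jobs
) as dag:
    # 15 parallel submit -> monitor streams
    for i in range(15):
        submit = DummyOperator(task_id=f"livy_submit_{i}")
        monitor = DummyOperator(task_id=f"livy_monitor_{i}")
        submit >> monitor  # the Spark job stays "running" across both tasks
```

The problem is that once a submit task finishes, its slot is freed even though the Spark job it started is still running under the monitor task.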
Could you please advise me how this can be resolved?
Thanks,
Dmytro Boiko