How can I use Astro in an SSH-restricted environment?

How do you manage simple tasks on remote hosts if SSH is prohibited? In that case, the SSH operators are useless.
For example, I want to run a shell script on a remote host and, when the script completes, move on to the other DAG tasks.
I could use HTTP or a message queue, but I would need some kind of Airflow/Astro agent to start the script and monitor its status.
Is there such an agent?

Hey @VeslavG

There is no such agent, but there are Airflow sensors. If, as part of the external process, you can create a pid file (standard *nix-style process tracking), a sensor can keep checking for it while it waits for the process. At the end of the external process, remove the file and the sensor will fail.

The following task can then be executed on the FAILED status of the sensor.
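Roughly, the shared-storage variant of this pid-file idea could look like the sketch below. Note it inverts the failure trick: the sensor *succeeds* once the pid file disappears, so the downstream task needs no special trigger rule. The path, dag_id, and timings are all made up:

```python
import os


def pid_file_gone(path: str = "/shared/run/external_job.pid") -> bool:
    """Poke callable: True once the external process has removed its pid file."""
    return not os.path.exists(path)


# Airflow wiring; guarded so the poke logic above works even without Airflow installed.
try:
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.empty import EmptyOperator
    from airflow.sensors.python import PythonSensor

    with DAG(
        dag_id="wait_for_external_process",
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        # Succeeds once the pid file on shared storage disappears,
        # i.e. the external process has finished and cleaned up.
        wait_for_pid = PythonSensor(
            task_id="wait_for_pid_removal",
            python_callable=pid_file_gone,
            poke_interval=30,
            timeout=2 * 60 * 60,
        )
        after = EmptyOperator(task_id="after_external_process")
        wait_for_pid >> after
except ImportError:
    pass
```

This assumes the pid file lives on storage the Airflow workers can read; otherwise the sensor has nothing to poke.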

Let me know your thoughts.


Hi Manmeet,
As I understand it, this approach still requires SSH into the remote host if implemented as is.
One alternative would be to create this temp file on shared storage that the sensor can access.
Yet another alternative would be to run a small HTTP API server on that host, which HTTP sensors could poke. But with that approach we’re back to the idea of an agent.

The bigger issue with this approach is that, before checking the status via sensors, you need to start the process somehow, using operators.
And we’re back to square one.


Maybe I understood incorrectly; I thought you mentioned that you have HTTP access to the server. Isn’t that the case?

You are correct about needing another location to track the pid. So there are a few options:

  1. Use a combination of HTTPOperator and HTTPSensor (if there is an API endpoint to track the status of a request).
  2. If there is no HTTP endpoint to check the status, use HTTPOperator to trigger the job (if HTTP is an option) and then an SFTP server to track the status of a file used as a pid.
  3. Flip the execution flow: post a file to an SFTP server that the external process watches; the process starts when it sees the file and puts a completion file there when it finishes.
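A rough sketch of option 1 (both endpoints, the `remote_host` connection, and the status-response format are all hypothetical and would need to exist on your side):

```python
import json


def job_finished(response) -> bool:
    """response_check for the status sensor: True when the (hypothetical)
    endpoint reports the job as done."""
    return json.loads(response.text).get("status") == "done"


# Airflow wiring; guarded so the response check above works without the HTTP provider.
try:
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.http.operators.http import SimpleHttpOperator
    from airflow.providers.http.sensors.http import HttpSensor

    with DAG(
        dag_id="trigger_and_track_remote_job",
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        # Trigger the job via a hypothetical endpoint exposed by the host
        start_job = SimpleHttpOperator(
            task_id="start_job",
            http_conn_id="remote_host",
            endpoint="jobs/run-script",
            method="POST",
        )
        # Poll a hypothetical status endpoint until the job reports done
        wait_for_job = HttpSensor(
            task_id="wait_for_job",
            http_conn_id="remote_host",
            endpoint="jobs/status",
            response_check=job_finished,
            poke_interval=30,
            timeout=3600,
        )
        start_job >> wait_for_job
except ImportError:
    pass
```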

We would need some sort of access, or an indirect way to communicate with that external process.
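And a sketch of option 3, the flipped flow. The trigger-file format, the paths, and the `sftp_dropbox` connection are all invented for illustration; the external process would need to poll for the trigger file and write the completion marker itself:

```python
import json


def trigger_payload(job: str) -> str:
    """Contents of the trigger file; this format is an invented convention
    that the external process would have to understand."""
    return json.dumps({"job": job})


# Airflow wiring; guarded so the payload helper works without the SFTP provider.
try:
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.sftp.operators.sftp import SFTPOperator
    from airflow.providers.sftp.sensors.sftp import SFTPSensor

    LOCAL_TRIGGER = "/tmp/run_script.trigger"

    def write_trigger() -> None:
        with open(LOCAL_TRIGGER, "w") as f:
            f.write(trigger_payload("run_script"))

    with DAG(
        dag_id="flipped_sftp_flow",
        start_date=datetime(2024, 1, 1),
        schedule=None,
        catchup=False,
    ) as dag:
        create_trigger = PythonOperator(
            task_id="write_trigger", python_callable=write_trigger
        )
        # Drop the trigger file where the external process polls for it
        post_trigger = SFTPOperator(
            task_id="post_trigger",
            ssh_conn_id="sftp_dropbox",  # assumed Airflow connection
            local_filepath=LOCAL_TRIGGER,
            remote_filepath="/dropbox/run_script.trigger",  # hypothetical path
            operation="put",
        )
        # Wait until the external process uploads its completion marker
        wait_done = SFTPSensor(
            task_id="wait_for_done",
            sftp_conn_id="sftp_dropbox",
            path="/dropbox/run_script.done",  # hypothetical path
            poke_interval=60,
            timeout=2 * 60 * 60,
        )
        create_trigger >> post_trigger >> wait_done
except ImportError:
    pass
```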


Sorry, I’m new to Airflow and maybe I’m missing something.
How can I trigger a job on a remote host via the HTTP operator?
Do I need a custom HTTP server on that host?
My hosts are just Java or C# application servers. Again, everything here rests on the assumption that I have a very well designed API server.