I am trying to set up the Astro CLI on our shared dev server. I've succeeded in installing it and starting it with astro airflow start. By default, the Airflow webserver is hosted at http://localhost:8080/admin/. I need to configure it so that it is accessible from my local machine. Is there a way to specify the IP or network interface that the webserver is bound to?
In your project directory, you should be able to run astro airflow init to scaffold out a project, if you haven't yet. Once you have those files created, you can edit the config file at ./.astro/config.yaml and add:

webserver:
  port: 8081

to switch the webserver port to 8081, for instance.
We don't need to change the port; we need to change the host. That is, the webserver is hosted at http://localhost:8080/admin/, but I need access to it remotely, so I believe host needs to be 0.0.0.0. I've tried putting

webserver:
  host: 0.0.0.0

in my config.yaml, but that didn't work. I also tried hostname instead of host.
Ah, my fault. A few of us were talking and somehow I thought we were only dealing with the port.
Under the hood, the CLI uses docker-compose-style functionality through a library called libcompose. After some googling, it looks like docker-compose supports passing an IP through when you bind the port. I found this explanation: https://www.reddit.com/r/docker/comments/731cop/docker_compose_change_default_ip_from_0000/.
If libcompose works the same way, you might be able to change your port setting to something like:

webserver:
  port: 0.0.0.0:8081

For reference, that part of the CLI is here: https://github.com/astronomer/astro-cli/blob/master/airflow/include/composeyml.go#L81-L82
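For context, the docker-compose syntax that Reddit thread describes uses the form HOST_IP:HOST_PORT:CONTAINER_PORT in a service's ports list. Here is a minimal sketch of what such a binding looks like in a compose file (the service name and port numbers are illustrative, not the CLI's actual generated file):

```yaml
services:
  webserver:
    ports:
      # bind host interface 0.0.0.0, host port 8081, to container port 8080
      - "0.0.0.0:8081:8080"
```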
Unfortunately, that doesn't work. It produces Airflow Webserver: http://localhost:0.0.0.0:8081/admin/.
Ah, yeah, it looks like the CLI prints out a malformed URL because we have the localhost part hard-coded. We haven't really considered this use case yet, but we can patch that up.
Even though the output link is wrong, I was able to test this out here on our local network. I used the above config, and was able to access the webserver UI locally, as well as from another machine on the network.
You're right, that does actually work. We were having unrelated issues connecting to our dev box that masked the fact that this configuration change had worked. Thank you very much for your guidance and independent testing. A patch for the printed webserver URL would be awesome.
It has been four years since the last reply, but I am wondering the same thing. I'm running astro dev start on a remote server and it binds to 127.0.0.1, so I don't have access to the Airflow UI. Is there any way to configure config.yaml so that it is hosted on 0.0.0.0? Thanks
Hi, is there any answer yet? Or did you solve this issue?
The Astro CLI is for local development only, and you need to be on the server/VM to be able to connect to the Airflow UI. The Astro CLI uses the Local executor, which is suitable for dev/testing purposes.
For testing you can also refer to this solution.
You can start a free trial on Astro that allows you to access Airflow from the cloud and even choose between the Celery and Kubernetes executors.
Thanks
Manmeet
I just solved it after some research. Run:

astro config set -g airflow.expose_port true

This sets expose_port: "true" in config.yaml.
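If it helps anyone, here is a sketch of what the relevant section of the global config.yaml should contain after running that command (the exact file layout may vary between CLI versions, so treat this as illustrative):

```yaml
airflow:
  expose_port: "true"
```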