Using bind mounts when developing locally

Hi,

I have a specific use case: I want to develop a Python package locally that works well with Astronomer. I am also working on Windows, and I’d like to install the package with the -e flag.

i.e. I have a Python package living in C:\dev\my_package and an Astronomer project in C:\dev\astro. My initial thought was to just have this line in C:\dev\astro\requirements.txt: -e ..\\my_package. This obviously didn’t work. Then I thought I might add a docker-compose.override.yml like so, to bind mount the files into the containers:

version: "2"
services:
  webserver:
    volumes:
      - C:\dev\my_package:/usr/local/airflow/my_package:ro
  scheduler:
    volumes:
      - C:\dev\my_package:/usr/local/airflow/my_package:ro

and then have my requirements.txt like so to install the package:

-e /usr/local/airflow/my_package

But this does not work either, giving this error:

ERROR: /usr/local/airflow/my_package is not a valid editable requirement. It should either be a path to a local project or a VCS URL (beginning with svn+, git+, hg+, or bzr+).

Is there a good way to achieve my desired result of locally developing a Python package for use in Astronomer?

I think the error is confusing here, and the problem is actually with the “C:” part – try changing \ to / on the Windows side of the paths and see if that helps?

(I think the \d is getting read by the YAML parser before Docker sees it, so by the time Docker gets it, the value is C:devmy_package:/usr/local/airflow/my_package:ro!)
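For reference, the same override with the Windows side of the paths reslanted (same mounts, just forward slashes, which Docker Desktop accepts):

```yaml
version: "2"
services:
  webserver:
    volumes:
      - C:/dev/my_package:/usr/local/airflow/my_package:ro
  scheduler:
    volumes:
      - C:/dev/my_package:/usr/local/airflow/my_package:ro
```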

Have you considered building the package into an sdist or wheel and installing it that way? I’ve successfully deployed my own Python packages to a local Airflow instance using that process.

First, build the Python package as a wheel or tar.gz (see https://packaging.python.org/tutorials/packaging-projects/). Then, either copy the whl/tar.gz into your running scheduler container and pip install it there, or pip install it via requirements.txt or via a custom Dockerfile. I’m using a custom Dockerfile since I have a lot of other requirements.
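As a concrete sketch of the build step (the package name, version, and layout here are hypothetical stand-ins for C:\dev\my_package), building a wheel from a minimal package could look like this:

```shell
# Minimal package layout (stand-in for the real my_package project)
mkdir -p my_package/src/my_package
cat > my_package/pyproject.toml <<'EOF'
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"
version = "0.1.0"
EOF
echo '__version__ = "0.1.0"' > my_package/src/my_package/__init__.py

# Build the wheel into dist/ using the already-installed setuptools
pip wheel ./my_package -w dist --no-deps --no-build-isolation
ls dist/
```

The resulting my_package-0.1.0-py3-none-any.whl can then be copied into the Astro project and referenced from requirements.txt, or COPY’d and pip-installed in a custom Dockerfile.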

If you want to give that a try and run into issues I’m happy to answer questions.

Thanks, you too!

I was able to identify the true error by inspecting the running container and realizing that

  1. the package files were successfully mounted into the containers, and
  2. pip install -e mypackage fails due to error: [Errno 30] Read-only file system -> unfortunately, this means I can’t edit a file locally and see the changes directly in a running local Astronomer session: https://github.com/pypa/pip/issues/3930

Would have been too good - but I suppose I can stop and restart whenever I want to check out a code change… :slight_smile:
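One follow-up that might be worth testing (untried here): the [Errno 30] could simply come from the :ro (read-only) flag on the bind mounts, since pip install -e needs to write metadata into the package directory. Dropping :ro from the override, like so, might get past that error:

```yaml
version: "2"
services:
  webserver:
    volumes:
      - C:/dev/my_package:/usr/local/airflow/my_package
  scheduler:
    volumes:
      - C:/dev/my_package:/usr/local/airflow/my_package
```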