Error using Pandas: `module 'pandas' has no attribute 'core'`

I’m trying to deploy DAGs that use some Pandas code. Every deploy produces the following error for each DAG:

Broken DAG: module 'pandas' has no attribute 'core'

I added pandas to requirements.txt and re-deployed, but I’m still seeing the following:

➜  astronomer2 git:(master) ✗ astro airflow start
Sending build context to Docker daemon  293.9kB
Step 1/1 : FROM astronomerinc/ap-airflow:0.7.5-1.9.0-onbuild
# Executing 5 build triggers
 ---> Using cache
 ---> Using cache
 ---> Running in 65d45425ff73
Command "/usr/bin/python3.6 -m pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-h6bw_zss --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- wheel setuptools Cython "numpy==1.9.3; python_version=='3.5'" "numpy==1.12.1; python_version=='3.6'" "numpy==1.13.1; python_version>='3.7'"" failed with error code 1 in None
You are using pip version 18.1, however version 19.0.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
The command '/bin/sh -c pip install --no-cache-dir -q -r requirements.txt' returned a non-zero code: 1
Error: command 'docker build -t astronomer-2/airflow:latest failed: failed to execute cmd: exit status 1

We’ve seen this issue with more than a handful of folks recently. There’s an incompatibility between Alpine 3.8 and the version of Pandas we ship that we’re fixing in our next release (Astronomer v0.8). In the meantime, you’ll need to specify a few OS-level packages so that pandas can build and run in those containers.

Here’s the current fix:

  1. Pull Pandas out of requirements.txt if it’s in there (it’s installed by default)
  2. Add the following lines to your packages.txt:
libgcrypt=1.8.3-r0
libxslt-dev
g++
libstdc++
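If it helps, the two steps above can be scripted. This is just a sketch, assuming requirements.txt and packages.txt sit in your Astronomer project root; adjust paths to match your layout:

```shell
# Step 1: drop any pandas pin from requirements.txt -- the Astronomer
# image installs pandas by default, so pinning it again triggers the
# failing build. `|| true` keeps the script going if there's no match.
grep -vi '^pandas' requirements.txt > requirements.txt.tmp || true
mv requirements.txt.tmp requirements.txt

# Step 2: append the OS-level packages pandas needs on Alpine 3.8.
# packages.txt is picked up by the onbuild image at docker build time.
cat >> packages.txt <<'EOF'
libgcrypt=1.8.3-r0
libxslt-dev
g++
libstdc++
EOF
```

After running it, re-deploy as usual and the `docker build` step should get past the pip install.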

If that doesn’t resolve the error on the first try, run astro airflow stop followed by astro airflow start to force a clean rebuild, and see if that does the trick.

If you’re still having trouble, reach out to support@astronomer.io and we’ll be happy to help.