Snowflake connector

Hi all,

I am trying to install the Snowflake connector on Astronomer Cloud and I’m running into issues. If I try to install it via requirements.txt, I get:

Command "/usr/bin/python3.6 -u -c "import setuptools, tokenize;__file__='/tmp/pip-install-rvb2jufq/pycryptodomex/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-record-pjac4a8a/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-install-rvb2jufq/pycryptodomex/

I was able to get some traction by running “pip install --user snowflake-connector-python” and discovering that gcc isn’t installed. After some time down the rabbit hole, and after adding a fair number of packages, I eventually ended up at a place where I would have to custom-build something (Apache Arrow, for pyarrow).

So, finally, my question: I have seen the Snowflake connector in other folks’ pip freezes on these forums. Is this something that used to be included but no longer is, or does anyone have the magic incantation to make this work?

I was able to install the Snowflake connector by doing the following:

packages.txt:
gcc
build-base

… and pinning the connector to a version that predates the introduction of Arrow. asn1crypto also needed to be pinned to an earlier version for compatibility:

requirements.txt:
snowflake-connector-python==1.7.11
asn1crypto==0.24.0
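
With those pins in place, here is a quick sanity check that the connector imports and can reach Snowflake. This is just a minimal sketch; the account, user, and password values are placeholders, not anything specific to this setup:

test_snowflake.py:
import snowflake.connector

# Placeholder credentials; substitute your own account identifier and login.
conn = snowflake.connector.connect(
    account='my_account',
    user='my_user',
    password='my_password',
)
try:
    cur = conn.cursor()
    # SELECT CURRENT_VERSION() is a cheap round trip that proves the driver works.
    cur.execute('SELECT CURRENT_VERSION()')
    print(cur.fetchone())
finally:
    conn.close()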

I just added snowflake-connector-python to my requirements.txt, ran astro airflow start, and it came up without any issues. I used the following image in my Dockerfile: astronomerinc/ap-airflow:0.10.2-1.10.5-onbuild
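
For what it’s worth, the Dockerfile itself can be just the FROM line; as far as I know, the onbuild images install whatever is listed in packages.txt and requirements.txt during the build:

Dockerfile:
FROM astronomerinc/ap-airflow:0.10.2-1.10.5-onbuild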

Can you create a test folder, run astro init, update the Dockerfile and requirements.txt, and then run astro airflow start?
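
Something along these lines (the folder name is arbitrary):

mkdir snowflake-test && cd snowflake-test
astro init
# point the Dockerfile at astronomerinc/ap-airflow:0.10.2-1.10.5-onbuild
# add snowflake-connector-python to requirements.txt
astro airflow start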

@AndrewHarmon Yep, using the new image worked for me, though there is a dependency conflict:

ERROR: azure-mgmt-nspkg 3.0.2 has requirement azure-nspkg>=3.0.0, but you'll have azure-nspkg 2.0.0 which is incompatible.
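
If anyone hits the same conflict, one untested idea is to pin the namespace package explicitly in requirements.txt so pip resolves the newer version (assuming nothing else in the image needs azure-nspkg 2.x):

requirements.txt:
snowflake-connector-python
azure-nspkg>=3.0.0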