DatabricksSQLOperator ValueError "too many values to unpack"

I’m attempting to read data from Databricks with the DatabricksSqlOperator, following the example code here:

My code (deployed in Docker on localhost via the Astro CLI):

 select_into_file = DatabricksSqlOperator(
        task_id="select_into_file",  # required by Airflow; other arguments omitted from the post
        sql="select * from schema.tokenization_input_1000",
    )
When execution reaches the DatabricksSqlOperator.execute() method, I consistently get this error at line 164:

[2022-11-22, 17:07:32 UTC] {} INFO - Executing: select * from schema.tokenization_input_1000
[2022-11-22, 17:07:32 UTC] {} INFO - Using connection ID 'tokenization_databricks' for task execution.
[2022-11-22, 17:07:32 UTC] {} INFO - Using token auth.
[2022-11-22, 17:07:33 UTC] {} INFO - Using token auth.
[2022-11-22, 17:07:34 UTC] {} INFO - Successfully opened session b'\x01\xedj\x88(\xb7\x10\xd7\xa2\x16\xf2\t\x0e\xb4\xd9\xe3'
[2022-11-22, 17:07:34 UTC] {} INFO - Running statement: select contributorID, datasetID, PatientID1 from schema.tokenization_input_1000, parameters: None
[2022-11-22, 17:07:37 UTC] {} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/airflow/providers/databricks/operators/", line 164, in execute
    schema, results = cast(List[Tuple[Any, Any]], response)[0]
ValueError: too many values to unpack (expected 2)
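For context, the failing line expects each element of the response to be a two-item (schema, results) pair. A minimal sketch of what appears to be going wrong, based only on the traceback above (the sample row values are my own placeholders): with this connector version the first element is a full result row, so unpacking it into two names fails.

```python
# Simulated response: the first element is a 3-column row from the query
# (contributorID, datasetID, PatientID1), not a (schema, results) pair.
response = [("contrib-1", "dataset-1", "patient-1")]

try:
    # Same unpacking pattern as the provider's execute() at line 164.
    schema, results = response[0]
except ValueError as exc:
    print(exc)  # -> too many values to unpack (expected 2)
```

Unpacking a 3-tuple into two targets raises exactly the ValueError seen in the task log.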
  • I’ve verified that my Databricks connection is defined correctly.
  • I can see the queries being received in Databricks from “PyDatabricksSqlConnector 2.0.2”.
  • I can see Databricks responding with the expected query results.
  • I added do_xcom_push=True to the operator call, but I do not see any data in XCom.
  • I wrote a separate Python script using databricks-sql-connector to double-check that my local machine can read the data, and it works correctly.
  • The Databricks SQL Warehouse version is v2022.35.


FYI: This issue was fixed by upgrading the Databricks Airflow provider package to v4.0.0, which was released this week:
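For anyone else on the Astro CLI, one way to pick up the fix is to pin the provider in the project's requirements.txt and rebuild (the exact version constraint here is an assumption; adjust to whatever v4.x release you need):

```text
apache-airflow-providers-databricks>=4.0.0
```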
