Issue Summary
Confusion with Credential Configuration in Kedro 0.19 vs 0.18
Hello Kedro team,
I have encountered an issue with configuring credentials for accessing storage via abfss in Kedro 0.19.3 that was not present in version 0.18. Here is a summary of the problem:
In Kedro 0.18, I configured the credentials for accessing storage through Spark configuration with an Azure Service Principal, and everything worked fine. However, after upgrading to Kedro 0.19.3, the same setup stopped working. After a couple of days of troubleshooting, I discovered that adding the credentials as environment variables resolved the issue.
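For reference, the two setups look roughly like the sketch below. The storage account name, secret placeholders, and variable names are illustrative only; whether the spark.hadoop. prefix is needed depends on how the session is created (it is not required on Databricks), and the environment variables are the ones read by azure-identity's EnvironmentCredential, which adlfs/fsspec can pick up for abfss paths.

import os
from pyspark.sql import SparkSession

ACCOUNT = "mystorageaccount"  # placeholder storage account name

# Kedro 0.18 setup: hand the Azure Service Principal to the Hadoop ABFS driver
# through Spark configuration (OAuth / ClientCredsTokenProvider).
spark = (
    SparkSession.builder.appName("example")
    .config(f"spark.hadoop.fs.azure.account.auth.type.{ACCOUNT}.dfs.core.windows.net", "OAuth")
    .config(
        f"spark.hadoop.fs.azure.account.oauth.provider.type.{ACCOUNT}.dfs.core.windows.net",
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    )
    .config(f"spark.hadoop.fs.azure.account.oauth2.client.id.{ACCOUNT}.dfs.core.windows.net", "<client-id>")
    .config(f"spark.hadoop.fs.azure.account.oauth2.client.secret.{ACCOUNT}.dfs.core.windows.net", "<client-secret>")
    .config(
        f"spark.hadoop.fs.azure.account.oauth2.client.endpoint.{ACCOUNT}.dfs.core.windows.net",
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    )
    .getOrCreate()
)

# Workaround that made Kedro 0.19.3 work: expose the same Service Principal as
# environment variables, which azure-identity (used by adlfs/fsspec for abfss://)
# can resolve through EnvironmentCredential.
os.environ["AZURE_CLIENT_ID"] = "<client-id>"
os.environ["AZURE_TENANT_ID"] = "<tenant-id>"
os.environ["AZURE_CLIENT_SECRET"] = "<client-secret>"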
My questions are:
Hi @Carlos Prieto - Tomtom, thanks for the detailed explanation and sorry you had a bumpy experience. We're looking into this.
I have a few follow-up questions: you are using the OmegaConfigLoader, is that correct?

Thanks for the quick response! Here are the details:
# Class that manages how configuration is loaded.
from kedro.config import OmegaConfigLoader  # noqa: E402

CONFIG_LOADER_CLASS = OmegaConfigLoader

# Keyword arguments to pass to the CONFIG_LOADER_CLASS constructor.
CONFIG_LOADER_ARGS = {
    "base_env": "base",
    "default_run_env": "local",
    "config_patterns": {
        "spark": ["spark*", "spark*/**"],
    },
}
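For context, the "spark" config pattern above is consumed in a project hook along the lines of the standard Kedro SparkHooks example (a sketch; the exact hook in this project may differ, and it assumes the Spark settings live in conf/base/spark.yml):

from kedro.framework.hooks import hook_impl
from pyspark import SparkConf
from pyspark.sql import SparkSession


class SparkHooks:
    @hook_impl
    def after_context_created(self, context) -> None:
        """Initialise a SparkSession from the config matched by the "spark" pattern."""
        # Load the Spark configuration (conf/base/spark.yml plus any local overrides).
        parameters = context.config_loader["spark"]
        spark_conf = SparkConf().setAll(parameters.items())

        # Initialise the Spark session for the whole Kedro run.
        spark_session_conf = (
            SparkSession.builder.appName(context.project_path.name)
            .enableHiveSupport()
            .config(conf=spark_conf)
        )
        _spark_session = spark_session_conf.getOrCreate()
        _spark_session.sparkContext.setLogLevel("WARN")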
delta-spark==2.3.0
kedro==0.19.3
pyspark==3.3.2
azure-identity==1.12.0
azure-keyvault-secrets==4.7.0
pandas==1.5.3
country_converter==1.0.0
unidecode==1.3.6
haversine==2.8.0
rapidfuzz==3.1.2
numpy==1.23.1
azure-mgmt-network==25.2.0
azure-mgmt-compute==30.4.0
kedro-viz==8.0.1
kedro-datasets[spark-sparkdataset,spark-sparkjdbcdataset,pandas-csvdataset,pickle-pickledataset]==3.0.1
hdfs==2.7.3
s3fs==2024.3.1
postal==1.1.10
deltalake==0.16.3
opentraveldata==0.0.9.post2
fuzzywuzzy==0.18.0
python-Levenshtein==0.25.0
country-converter==1.0.0
babel==2.14.0
langchain==0.0.347
openai>=0.27.0
geopandas~=0.11.0
tiktoken==0.6.0
faiss-cpu==1.8.0