

Pushing files to S3 with dynamic names

Hey team, I want to push files to S3 but with dynamic names, e.g. appending a timestamp to the filename, just to store multiple copies of the file.

dummy_csv:
  type: pandas.CSVDataset
  filepath: s3://ml-datawarehouse/warehouse/test.csv
  credentials: dev_s3

Right now, test.csv is being overwritten in S3. Is a config resolver the answer to this? I tried a resolver like the one below, but no success yet:

dummy_csv:
  type: pandas.CSVDataset
  filepath: s3://ml-datawarehouse/warehouse/test_"{$today:}".csv
  credentials: dev_s3


# In the project's settings.py
from datetime import date

CONFIG_LOADER_ARGS = {
    "custom_resolvers": {
        # Registers a "today" resolver that returns the current date
        "today": lambda: date.today(),
    }
}


Have you tried using Kedro dataset versioning?
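Versioning is switched on per dataset in the catalog, so a minimal sketch with your same entry would be:

dummy_csv:
  type: pandas.CSVDataset
  filepath: s3://ml-datawarehouse/warehouse/test.csv
  credentials: dev_s3
  versioned: true

Kedro then saves each run to a timestamped path under the filepath instead of overwriting test.csv.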

Not really, but I know that versioning uses different timestamps and might help me. Thanks a lot.
But still, any idea what the issue is here? Why can't I utilise the power of resolvers?

Any idea on this?

The dollar sign should be outside of the curly braces.
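So with the same catalog entry, a sketch of the corrected filepath would be:

dummy_csv:
  type: pandas.CSVDataset
  filepath: s3://ml-datawarehouse/warehouse/test_${today:}.csv
  credentials: dev_s3

i.e. ${today:} (not "{$today:}") is the interpolation syntax, so the "today" resolver registered in CONFIG_LOADER_ARGS gets called when the catalog is loaded.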

Thanks. It was such a silly one. πŸ™„
