Dear Forum,
I have coded some transformation and processing of CSV files in a notebook. As the next step, I would like the notebook to automatically process files from lakehouse storage whenever a new CSV file is added to a folder in the lakehouse.
I saw that there is an option to trigger notebooks and pass the file name to the notebook as a variable, but unfortunately this appears to be supported only for Azure Blob Storage, not for lakehouses.
The only workaround I see for automatically processing new files is to schedule the notebook and implement the detection in Python itself, checking whether any new files have been added since the last run.
Since I am new to Fabric, I wanted to ask if there is a more elegant way to implement this in Fabric. Is there maybe an option in Pipelines/Dataflows that I missed?
Thank you very much in advance.
Best wishes,
Morris
Based on my understanding, as of now the event triggers within data pipelines apply only to Blob Storage, not to lakehouses.
At present there is no way to build an event-driven framework with a lakehouse as the source.
Thank you for your answer. Okay, then it seems I will have to use the scheduling workaround.
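For reference, here is a minimal sketch of that scheduling workaround, assuming the lakehouse is attached as the notebook's default lakehouse so its Files area is mounted under /lakehouse/default/Files; the folder name "incoming", the state file "last_run.json", and the process_csv placeholder are assumptions, not part of the original question.

```python
# Minimal sketch: on each scheduled run, process only CSV files added to a
# lakehouse folder since the previous run, using a small watermark file.
import json
from pathlib import Path

WATCH_DIR = Path("/lakehouse/default/Files/incoming")                # folder to watch (assumed)
STATE_FILE = Path("/lakehouse/default/Files/_state/last_run.json")   # persisted watermark (assumed)


def load_last_run() -> float:
    """Return the watermark from the previous run, or 0.0 on the first run."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text()).get("last_run", 0.0)
    return 0.0


def save_last_run(ts: float) -> None:
    """Persist the newest modification time seen so the next run can skip old files."""
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps({"last_run": ts}))


def process_csv(path: Path) -> None:
    # Placeholder for the existing transformation/processing logic.
    print(f"Processing {path.name}")


last_run = load_last_run()
newest_seen = last_run

# Pick up only CSV files whose modification time is newer than the watermark.
for csv_path in sorted(WATCH_DIR.glob("*.csv")):
    mtime = csv_path.stat().st_mtime
    if mtime > last_run:
        process_csv(csv_path)
        newest_seen = max(newest_seen, mtime)

save_last_run(newest_seen)
```

With this approach the notebook can simply be put on a schedule; the watermark file makes each run idempotent with respect to files it has already processed, at the cost of a detection delay equal to the schedule interval.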