Subject: Urgent Help Needed: Automating File Transfer from SharePoint to Data Lakehouse Using Fabric
Hi everyone,
I am trying to automate a process where a new file in a SharePoint folder, which is replaced every day with the same name, should be pushed to a Lakehouse in Microsoft Fabric. The file's arrival time varies, so I cannot use an event-based trigger. Therefore, I was thinking of using a notebook to trigger the process when a new file arrives in SharePoint.
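Here is roughly the notebook logic I had in mind (just a rough sketch on my part, not tested; I am assuming the file can be pulled through the Microsoft Graph API, and the IDs, secret, and paths below are placeholders):

```python
# Rough sketch only: download the daily file from SharePoint via Microsoft Graph
# and land it in the Lakehouse Files area. All IDs, secrets, and paths are placeholders.
import os
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"          # better stored in Azure Key Vault
SITE_ID = "<sharepoint-site-id>"
FILE_PATH = "Shared Documents/daily/report.csv"          # path inside the site's default drive
LAKEHOUSE_TARGET = "/lakehouse/default/Files/sharepoint/report.csv"

def get_token() -> str:
    # Client-credentials flow against Microsoft Entra ID for a Graph token
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "https://graph.microsoft.com/.default",
            "grant_type": "client_credentials",
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def download_file(token: str) -> bytes:
    # Fetch the file content by path from the site's default document library
    url = f"https://graph.microsoft.com/v1.0/sites/{SITE_ID}/drive/root:/{FILE_PATH}:/content"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.content

content = download_file(get_token())

# The Lakehouse attached to the notebook is mounted under /lakehouse/default
os.makedirs(os.path.dirname(LAKEHOUSE_TARGET), exist_ok=True)
with open(LAKEHOUSE_TARGET, "wb") as f:
    f.write(content)
```

My idea was to schedule this notebook to run periodically and only reload when the file's lastModifiedDateTime changes, but I am not sure this is the right pattern.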
I have used Dataflow Gen2 to get the data, set the destination as Lakehouse, and selected "append data."
Can anyone please help me with the right approach? I am new to Fabric and to data engineering concepts, and your guidance will be greatly appreciated.
Thank you!
Hi @manishkumar13,
Thanks for reaching out to the Microsoft Fabric Community Forum.
After reviewing the details you provided, here are a few approaches that might help resolve the issue. Please go through the following link for more information:
Options to get data into the Lakehouse - Microsoft Fabric | Microsoft Learn
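For example, if you land the file in the Lakehouse Files area from a notebook, appending it to a Delta table takes only a few lines of PySpark (a minimal sketch; the file path and table name below are placeholders):

```python
# Minimal sketch: append a CSV that was landed under Files/ to a Lakehouse Delta table.
# The relative path and table name are placeholders.
df = spark.read.option("header", "true").csv("Files/sharepoint/report.csv")

# Append to (or create on first run) a managed Delta table in the attached Lakehouse
df.write.mode("append").format("delta").saveAsTable("daily_report")
```

You can then schedule the notebook, or a pipeline that calls it, at whatever interval suits the file's arrival window.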
If this post helps, please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.
Best Regards.
Thanks for your reply.
I will try this solution and see if it works.
Regards,
Manish Kumar
Hi @manishkumar13,
Thank you for the update. If this post helps, please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.
Best Regards.
This is a forum where users help users, time permitting. For urgent requests, contact a Microsoft partner near you.
Sorry for using the word "urgent." This is my first time using the forum, and I noticed other posts using it. Moving forward, I will not use it.