manishkumar13
Frequent Visitor

Urgent Help Needed: Automating File Transfer from SharePoint to a Lakehouse Using Microsoft Fabric

Hi everyone,

I am trying to automate a process where a file in a SharePoint folder, which is replaced every day under the same name, is pushed to a Lakehouse in Microsoft Fabric. The file's arrival time varies, so I cannot use an event-based trigger. Instead, I was thinking of using a notebook to trigger the process when the new file arrives in SharePoint.

I have used a Dataflow Gen2 to get the data, set the destination to the Lakehouse, and selected "append data."

Can anyone please help me with the right approach? I am new to Fabric and to data engineering concepts, and your guidance will be greatly appreciated.

Thank you!

1 ACCEPTED SOLUTION
v-kpoloju-msft
Community Support

Hi @manishkumar13,
Thanks for reaching out to the Microsoft Fabric Community Forum.

 

After reviewing the details you provided, here are a few steps that may help. Please work through them in order:

 

  • Ensure your connection to the SharePoint folder is correctly configured: check the URL, credentials, and permissions, and confirm that the file really is replaced daily (you can verify its presence and timestamp manually in the folder).
  • Set up your notebook to monitor the SharePoint folder, using a loop or a scheduled run to periodically check for a new file.
  • Double-check your Dataflow Gen2 settings: the source should be the SharePoint folder and the destination the Lakehouse, with the "append data" option selected.
  • Check the error logs for issues during the file transfer. Look for specific error messages and validate that the data format and structure are consistent and compatible with the Lakehouse.
  • Rule out network problems interrupting the transfer by verifying your network connection and firewall settings.
  • Review the automation logs for failures; they record each operation's status and any errors encountered.
  • Check for throttling or connection limits imposed by SharePoint or the Lakehouse, and test the process with a smaller sample file to determine whether the issue is related to file size or content.
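The "monitor from a notebook" step above can be sketched in Python. This is a minimal, hypothetical example assuming the SharePoint file is reached through the Microsoft Graph drive-item endpoints (which require an app registration and an OAuth token); the `drive_id`, `item_path`, and Lakehouse file path are placeholders you would replace with your own values:

```python
# Hedged sketch: poll a SharePoint file via Microsoft Graph from a Fabric
# notebook and copy it into the Lakehouse Files area when it has been replaced.
# All IDs, paths, and the token acquisition are assumptions, not a fixed API.
from datetime import datetime, timezone
from typing import Optional

def is_newer(last_seen: Optional[datetime], modified: datetime) -> bool:
    """True when the file's modified timestamp is later than the last one processed."""
    return last_seen is None or modified > last_seen

def poll_and_copy(token: str, drive_id: str, item_path: str,
                  lakehouse_path: str,
                  last_seen: Optional[datetime]) -> Optional[datetime]:
    """One polling pass: read metadata, download if newer, return the new watermark."""
    import requests  # available in Fabric notebook environments
    headers = {"Authorization": f"Bearer {token}"}
    # Graph: GET /drives/{drive-id}/root:/{item-path} returns driveItem metadata
    meta = requests.get(
        f"https://graph.microsoft.com/v1.0/drives/{drive_id}/root:/{item_path}",
        headers=headers, timeout=30).json()
    modified = datetime.fromisoformat(
        meta["lastModifiedDateTime"].replace("Z", "+00:00"))
    if not is_newer(last_seen, modified):
        return last_seen  # nothing new yet; keep the old watermark
    # Graph: appending :/content to the item path downloads the file bytes
    data = requests.get(
        f"https://graph.microsoft.com/v1.0/drives/{drive_id}/root:/{item_path}:/content",
        headers=headers, timeout=120).content
    # e.g. lakehouse_path = "/lakehouse/default/Files/daily_report.csv"
    with open(lakehouse_path, "wb") as f:
        f.write(data)
    return modified
```

You would call `poll_and_copy` from a scheduled notebook run (or inside a loop with a sleep), persisting the returned watermark between runs so the same daily file is not copied twice.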

Please go through the following link for more information:
Options to get data into the Lakehouse - Microsoft Fabric | Microsoft Learn

If this post helps, then please give us 'Kudos' and consider accepting it as a solution to help other members find it more quickly.

 

Best Regards.


6 REPLIES

Thanks for your reply.

I will try this solution and see if it works.

Regards,

Manish Kumar

Hi @manishkumar13,

Thank you for the update on the issue. If this post helps, then please give us 'Kudos' and consider accepting it as a solution to help other members find it more quickly.

Best Regards.

lbendlin
Super User


This is a forum where users help users, time permitting. For urgent requests, contact a Microsoft partner near you.

Sorry for using the word "urgent." This is my first time using the forum, and I noticed other posts using it. Moving forward, I will not use it.

