Hi community,
I already asked a question and marked an answer as the solution - but I need more information (https://community.fabric.microsoft.com/t5/Fabric-platform/Incremental-load-of-onedrive-files-into-fa...)
I have a customer who works with CSV files, which are currently imported directly into Power BI. I would love to have a data store in the middle, like a Fabric lakehouse (or Data Lake Gen2, or Blob Storage), where I can store all CSV files. The goal is to have all data in a structured manner in a lakehouse or database.
The recommendation in the answer in the link above was to use pipelines, which are capable of monitoring folders and starting automatically when new files arrive. But I could not see any trigger which observes a OneDrive folder. So I was thinking about using a Data Lake Gen2/Blob Storage (which can be monitored) - but I couldn't find a way to transfer the CSV files automatically (when they arrive) to that storage. There is a way using Power Automate, but only with a premium license - is this the only way to transfer files automatically to Azure storage...?
I really have difficulty understanding the "correct" way to transfer files to a data lake, blob storage, or lakehouse in an automated manner. (Not just as a one-time job, but automated for all new files in a folder!)
Especially when it comes to OneDrive, because this is a technology most of my customers are able to use.
Thanks, and sorry if this is a stupid question.
Holger
Hi @holgergubbels,
Thank you for reaching out on the Microsoft Community Forum.
Please consider the approaches below to transfer files from a OneDrive folder to a Lakehouse or Data Lake:
1. Use Power Automate (requires a premium license) to monitor OneDrive folders and automatically transfer new CSV files to a Data Lake Gen2, Blob Storage, or Lakehouse.
2. Use Azure Data Factory or Synapse pipelines for automation without depending on Power Automate. These tools can monitor and process new files using HTTP connectors or integration with OneDrive.
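Since neither ADF nor Fabric ships a native OneDrive connector, in practice option 2 means calling the Microsoft Graph API yourself, e.g. from a scheduled Fabric notebook. Below is a minimal sketch under stated assumptions: an Azure AD app registration with Files.Read.All, an access token already acquired, a known drive_id, and the default lakehouse mounted at /lakehouse/default/Files - none of these specifics come from this thread.

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def children_url(drive_id, folder_path):
    # Graph endpoint that lists the items in a OneDrive folder by path
    return f"{GRAPH}/drives/{drive_id}/root:/{folder_path}:/children"

def list_folder(token, drive_id, folder_path):
    # Call Graph with a bearer token and return the item metadata list
    req = urllib.request.Request(
        children_url(drive_id, folder_path),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]

def copy_csvs(token, drive_id, folder_path,
              target_dir="/lakehouse/default/Files/raw"):
    # Download every CSV in the OneDrive folder into the lakehouse Files area,
    # keeping the original file (unlike a Dataflow Gen2, which only reads it).
    for item in list_folder(token, drive_id, folder_path):
        if item["name"].lower().endswith(".csv"):
            with urllib.request.urlopen(item["@microsoft.graph.downloadUrl"]) as src:
                with open(f"{target_dir}/{item['name']}", "wb") as dst:
                    dst.write(src.read())
```

Scheduling the notebook (or a pipeline that runs it) every few minutes approximates a folder trigger, since Fabric has no event trigger for OneDrive itself.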
Please continue using the Microsoft Community Forum.
If you found this post helpful, please consider marking it as "Accept as Solution" and selecting "Yes" if it was helpful. This helps other members find it more easily.
Regards,
Pavan.
Thanks for your answer. Can you tell me how to find a "OneDrive" connector in Azure Data Factory...? There is no OneDrive connector in ADF... And there is no such connector in any technology in Fabric either, as far as I understood! For sure I can use Power Automate, but I need a premium license. Just for transferring some files into a lakehouse...?
As far as I can see, there is no connector anywhere to load files from OneDrive to anywhere else... For sure, you can use a Dataflow Gen2 - but using that, I lose the file. The dataflow reads the data, and I can append the data from each file and store it in a table or Parquet file. But there is no way to keep the original file or to copy it somewhere else after reading it...
Or did I miss something?
Hello @holgergubbels
Have you tried this: https://zappysys.com/api/integration-hub/onedrive-connector/azure-data-factory
No, because I do not want to load data from a cloud storage to an on-premises system just to load it back to a cloud storage... I am wondering why there is no solution inside Fabric to solve the problem... I think it is a very common scenario that there are files in a OneDrive folder which have to be transferred to a lakehouse/data lake etc...
Thanks @v-pbandela-msft
But there is no "OneDrive Connector" in ADF... Which "Integration" do you mean...? How would you monitor and import files in ADF?
Regards
Holger
Hi @holgergubbels,
Thank you for reaching out on the Microsoft Community Forum.
You’re correct that Azure Data Factory (ADF) doesn’t have a native OneDrive connector.
However, you can use these methods to monitor and import files from OneDrive:
1. Use ADF’s Web or Copy activity with the OneDrive (Microsoft Graph) REST API to pull files into Azure Data Lake or Blob Storage.
2. Use Power Automate to monitor OneDrive and trigger an ADF pipeline to move files once they arrive.
3. Sync OneDrive to a local folder and use ADF’s Blob Storage connectors for the transfer.
Please continue using the Microsoft Community Forum.
If you found this post helpful, please consider marking it as "Accept as Solution" and selecting "Yes" if it was helpful. This helps other members find it more easily.
Regards,
Pavan.
Power Automate is a straightforward tool for automating file transfers from OneDrive to Azure storage or a Lakehouse.
Use the "When a file is created" trigger in Power Automate to monitor a specific OneDrive folder for new files
One of the limitatiom will be Power Automate has a 50 MB limit for OneDrive triggers.
Azure Data Factory is more scalable and suitable for larger datasets.
If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.