AlexanderPowBI
Resolver I

Automate file movement lakehouse > OneDrive

Hello all,

 

I have a notebook that generates a number of files which I save to the lakehouse.

I need to move these files (or create them) in a OneDrive folder automatically.

I have tried the Graph API, but that won't do it for me as it seems to only support OAuth. Or is it possible to access it programmatically somehow, without user intervention? (I need to set this up as a scheduled pipeline.)

Any other ideas? From what I can see, pipelines don't have any feature to connect to OneDrive either.

1 ACCEPTED SOLUTION
AlexanderPowBI
Resolver I

Hi, and thank you both for your replies. I checked some of this out and it's probably possible, but I found another way. Putting this out there in case someone in the future needs to solve something similar.

 

Option 1 [best I found and will be using]: create a dedicated user with its own OneDrive, and use a refresh token to obtain access with the Graph API:

* Create a dedicated user for the flow, e.g. "XXXXX@myCompanyName.com".

* Use this user's OneDrive to land the files. Do not give it access to anything else.

* Register an app in Azure and give it delegated Files.ReadWrite access + admin consent (so the app can't access everyone's folders, only the authenticated user's).

* Use a manual device-code flow to obtain a refresh token (app.initiate_device_flow).

* Save the refresh token to a key vault. It can then be used to retrieve an access token when the notebook runs.

* Use the acquire_token_by_refresh_token method to get a new access token and a new refresh token. Update the key vault secret programmatically once retrieved so you always have a fresh refresh token.

* Do whatever you need to in the API; in this case, save files from the lakehouse to a OneDrive folder.

https://learn.microsoft.com/en-us/python/api/msal/msal.application.publicclientapplication?view=msal...
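The refresh-token exchange in the steps above could look roughly like this in MSAL. This is a sketch, not the author's exact code: the client/tenant IDs are placeholders, and persisting the rotated refresh token back to Key Vault is left to the caller.

```python
def authority_url(tenant_id: str) -> str:
    """Build the Entra ID authority URL for a given tenant."""
    return f"https://login.microsoftonline.com/{tenant_id}"

def get_graph_token(client_id: str, tenant_id: str, refresh_token: str,
                    scopes=("Files.ReadWrite",)):
    """Exchange the stored refresh token for a fresh access + refresh token pair."""
    import msal  # public client: no secret needed for delegated device-flow apps
    app = msal.PublicClientApplication(client_id,
                                       authority=authority_url(tenant_id))
    result = app.acquire_token_by_refresh_token(refresh_token, scopes=list(scopes))
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "token refresh failed"))
    # MSAL rotates refresh tokens: write result["refresh_token"] back to Key Vault
    return result["access_token"], result.get("refresh_token")
```

The one-time device flow (app.initiate_device_flow) is only needed interactively, to seed the first refresh token; after that the scheduled notebook only ever calls the function above.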

Option 2: Set up a notebook and follow the "Get access without a user" flow:

https://learn.microsoft.com/en-us/graph/auth-v2-service?tabs=http

By giving the registered app Files.ReadWrite.All under "Application permissions" (not delegated) and granting it admin consent, the app could read and write to OneDrive. However, this means it could access all folders for all users on the tenant, which I found to be a bad option.
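For completeness, the app-only flow the linked doc describes uses the client-credentials grant. A minimal MSAL sketch, assuming placeholder client ID/secret and tenant ID:

```python
GRAPH_DEFAULT_SCOPE = "https://graph.microsoft.com/.default"

def get_app_only_token(client_id: str, client_secret: str, tenant_id: str) -> str:
    """Acquire an app-only Graph token via the client-credentials flow."""
    import msal
    app = msal.ConfidentialClientApplication(
        client_id,
        client_credential=client_secret,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
    )
    # App-only tokens always request the .default scope; the effective
    # permissions (e.g. Files.ReadWrite.All) come from the app registration.
    result = app.acquire_token_for_client(scopes=[GRAPH_DEFAULT_SCOPE])
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "client-credentials flow failed"))
    return result["access_token"]
```

Note that such a token addresses drives as /users/{id}/drive rather than /me/drive, which is exactly why it can reach every user's OneDrive on the tenant.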

 

Option 3:

* Create a dedicated user for the flow, e.g. "XXXXX@myCompanyName.com", and restrict its OneDrive access.

* Give the registered app delegated read/write access + admin consent (so the app can't access everyone's folders, only the authenticated user's).

* Use the (not very recommended) ROPC flow to authenticate: https://learn.microsoft.com/en-us/entra/identity-platform/v2-oauth-ropc

* Put the password in a key vault for some added security.
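The ROPC flow above maps to a single MSAL call. Again a hedged sketch with placeholder IDs, shown mainly so the trade-off is visible: the raw password travels through your code.

```python
def get_token_ropc(client_id: str, tenant_id: str,
                   username: str, password: str,
                   scopes=("Files.ReadWrite",)) -> str:
    """Acquire a delegated token with the (discouraged) username/password flow."""
    import msal
    app = msal.PublicClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
    )
    # Password would be fetched from Key Vault at runtime, never hard-coded.
    result = app.acquire_token_by_username_password(
        username, password, scopes=list(scopes))
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "ROPC flow failed"))
    return result["access_token"]
```

ROPC also fails for accounts with MFA enabled, which is another reason Option 1's refresh-token approach is the safer default.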

 

Now the notebook can run from Fabric, read files from the lakehouse, and write them into the OneDrive folder. I found option one to be the most secure.
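Once a delegated access token is available (e.g. via Option 1), the final upload step can be a simple Graph PUT. A sketch using only the standard library; the paths are placeholders, and note this simple upload is documented for files under 4 MB (larger files need an upload session):

```python
from urllib.request import Request, urlopen

def onedrive_upload_url(remote_path: str) -> str:
    """Graph endpoint that writes file content at the given path in the user's drive."""
    return f"https://graph.microsoft.com/v1.0/me/drive/root:/{remote_path}:/content"

def upload_file(access_token: str, local_path: str, remote_path: str) -> None:
    """PUT a local (lakehouse-mounted) file into the service user's OneDrive."""
    with open(local_path, "rb") as f:
        data = f.read()
    req = Request(
        onedrive_upload_url(remote_path),
        data=data,
        method="PUT",
        headers={"Authorization": f"Bearer {access_token}",
                 "Content-Type": "application/octet-stream"},
    )
    urlopen(req)  # raises HTTPError on failure

# In a Fabric notebook the local path would be something like
# "/lakehouse/default/Files/exports/report.csv"
```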

 

View solution in original post

6 REPLIES
frithjof_v
Super User

I created an idea in the Power Automate forum to have a Lakehouse connector, please vote:

 

https://ideas.powerautomate.com/d365community/idea/8bb4e21b-ba3f-ef11-b4ae-000d3a05ba58

Good idea, voted 🙂 


Anonymous
Not applicable

Hi @AlexanderPowBI.

I'm glad to hear these suggestions helped, and thanks for sharing the improved workaround here. I think it will help others who have a similar requirement.

Regards,

Xiaoxin Sheng

Anonymous
Not applicable

HI @AlexanderPowBI,

Currently it seems that moving data directly from Fabric to OneDrive is not supported.

For this scenario, you can try creating a pipeline to copy the data from Fabric to Azure Blob Storage, and then use an Azure Logic App to move that data over to OneDrive.

Reference link:

Configure Azure Blob Storage in a copy activity - Microsoft Fabric | Microsoft Learn

Transfer files from Azure Blob Storage to Onedrive folder - Microsoft Q&A

Regards,

Xiaoxin Sheng

frithjof_v
Super User

This Reddit thread (link below) is talking about moving files into Fabric Lakehouse using ADLS Gen2 API and Power Automate.

 

Perhaps the opposite direction is also possible: using Power Automate to fetch the Lakehouse files with a GET request, and then save them in OneDrive.

 

https://www.reddit.com/r/MicrosoftFabric/comments/144qqsp/using_power_automate_to_put_files_directly...

 

https://www.linkedin.com/pulse/how-call-onelake-api-from-power-automate-enterprise-app-nigel-smith-4...

 

Here is some info about OneLake and ADLS Gen2 API https://learn.microsoft.com/en-us/fabric/onelake/onelake-api-parity
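Since OneLake exposes an ADLS Gen2-compatible endpoint, reading a Lakehouse file over HTTP might look something like this. This is an untested sketch: the workspace/lakehouse names are placeholders, and the caller must supply a valid bearer token with OneLake access.

```python
from urllib.request import Request, urlopen

def onelake_file_url(workspace: str, lakehouse: str, file_path: str) -> str:
    """Build the DFS-style URL for a file under a Lakehouse's Files folder."""
    return (f"https://onelake.dfs.fabric.microsoft.com/"
            f"{workspace}/{lakehouse}.Lakehouse/Files/{file_path}")

def read_onelake_file(token: str, workspace: str, lakehouse: str,
                      file_path: str) -> bytes:
    """GET the raw bytes of a Lakehouse file via the OneLake endpoint."""
    req = Request(onelake_file_url(workspace, lakehouse, file_path),
                  headers={"Authorization": f"Bearer {token}"})
    with urlopen(req) as resp:
        return resp.read()
```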

I have never used this API myself. Maybe it could work.

 

 

... or maybe the Notebook could save the files directly in OneDrive. I don't know, never tried.

 

... I never tried the File System connector in Data pipeline. Could that connector save files directly in a local OneDrive folder?

 

... Or maybe use Data pipeline to put the files in an Azure location and then use Power Automate to bring the files from Azure to OneDrive.

 

Just throwing out some ideas here 😅
