rtolsma
Helper I

Maintenance on Lakehouse Tables

Hi,

 

I'm trying to run Maintenance on Lakehouse tables that are in OneLake. However, when I try to run Maintenance on a table with only the Optimize and V-Order flags selected, I receive the following error:

 

(screenshot of the error message: 2024-10-29_15-39-48.png)

 

I have even tried running the operation from a Spark SQL notebook, and it failed with a similar error. When I put in the full OneLake and workspace URL, it errored again, saying that the operation could not be completed on Dataverse tables.
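For context, the Spark SQL involved is just the standard Delta maintenance statements. A minimal sketch (PySpark in a Fabric notebook, where the spark session is pre-defined; the table name is a placeholder):

# Minimal sketch of the maintenance commands in question.
# Assumes a Fabric notebook with the default Lakehouse attached;
# "dim_account" is a placeholder table name.
table_name = "dim_account"

# Compact small files and apply V-Order to the rewritten Parquet files
spark.sql(f"OPTIMIZE {table_name} VORDER")

# Remove data files no longer referenced by the Delta log (default 7-day retention)
spark.sql(f"VACUUM {table_name}")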

 

Please let me know how this can be resolved.

 

Thanks,

 

Russ

1 ACCEPTED SOLUTION

Hi @rtolsma ,

I believe the issue is that the Dataverse data is not actually managed by Fabric; the tables are shortcuts that point back to your Dataverse environment, so the error, while annoying, is correct.

(screenshot: richbenmintz_0-1730302299505.png)
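One way to see this from a notebook (a sketch only; the table name is a placeholder) is to ask Delta where the table's files actually live:

# Sketch: inspect where a table's Delta files are stored.
# Assumes a Fabric notebook with the Lakehouse attached; "account" is a placeholder.
# For a "Link to Fabric" table, the location should point at Dataverse-managed
# storage rather than your Lakehouse, which is why Fabric refuses to maintain it.
detail = spark.sql("DESCRIBE DETAIL account")
detail.select("name", "location", "format").show(truncate=False)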

 



I hope this helps,
Richard




29 REPLIES

Try downloading OneLake File Explorer to see your OneLake files:

https://learn.microsoft.com/en-us/fabric/onelake/onelake-file-explorer
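If you'd rather stay in a notebook, a rough alternative (a sketch, assuming the default Lakehouse is attached and mssparkutils is available in the Fabric Spark runtime) is to list the paths directly:

# Sketch: list Lakehouse/OneLake paths from a Fabric notebook,
# as an alternative to browsing with OneLake File Explorer.
from notebookutils import mssparkutils

# Tables area (managed tables and shortcuts) and Files area of the attached Lakehouse
for item in mssparkutils.fs.ls("Tables/"):
    print(item.name, item.isDir, item.size)

for item in mssparkutils.fs.ls("Files/"):
    print(item.name, item.isDir, item.size)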



I hope this helps,
Richard



I'm pretty sure external shortcuts don't consume OneLake storage.

 

Do you see any indications that it consumes OneLake storage?

 

Edit: Sorry, just became aware of your screenshot from the capacity metrics app.

 

That is very surprising.

 

Could there be some other Delta tables in this workspace (not managed by Dataverse)?

 

E.g. are you running some notebooks, pipelines or dataflows which create a copy of some of the Dataverse tables inside Fabric? 
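(For clarity, such a copy would look roughly like the sketch below, with placeholder names; a table written this way lives in the Lakehouse itself, so maintenance would apply to it.)

# Sketch of copying a Dataverse shortcut table into a Fabric-managed table.
# "account" is a placeholder shortcut table in the attached Lakehouse;
# the copy written with saveAsTable is a regular Delta table owned by the Lakehouse.
src = spark.read.table("account")
src.write.mode("overwrite").format("delta").saveAsTable("account_copy")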

All the tables are directly related to the Link to Fabric configuration. However, this data has only been exported for about 7 days, so it's quite possible that an optimize and vacuum job hasn't been run yet on Microsoft's underlying storage.

The docs here, https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-view-in-fabric#d..., say:

"When you link to Fabric from Power Apps, the system creates an optimized replica of your data in delta parquet format, the native format of Fabric and OneLake, using Dataverse storage such that your operational workloads aren't impacted." Is it possible that OneLake was chosen as the destination?



I hope this helps,
Richard



With Fabric Link (a.k.a. "Link to Fabric"), OneLake is the only destination available; it is always used when tables are exported from D365 Sales or D365 Finance and Operations.

 

Really appreciate all the suggestions!  I think I have a pretty good idea how to move forward.

frithjof_v
Super User

You mention Dataverse tables.

 

Perhaps it is not possible to do Lakehouse maintenance on Dataverse tables?

 

(Perhaps those tables are fully managed by the Dataverse integration mechanism).

Perhaps it isn't, but I can't find any documentation to confirm this. If that is the case, I have to say it's quite frustrating to see this failing and to experience this kind of performance. I need to provide a solution, or it almost becomes unusable.

Anonymous
Not applicable

Hi @rtolsma,

The error message indicates this issue seems related to permissions. I'd suggest you check those settings to see whether they match what the error message requires.

By the way, can you please share some more detailed information about this issue? That would help us clarify your scenario and test it to troubleshoot. Have you applied any changes to the credentials used in the ADLS Gen2 connection?

Regards,

Xiaoxin Sheng

This is a OneLake Lakehouse, and the user is a Workspace admin and a Fabric SKU admin. I cannot find any other place to manage permissions, and given that this is a Fabric operation triggered by clicking the Maintenance menu item for the Delta table, it seems to me that permissions should not be an issue. If they are, then the Maintenance menu item should be disabled for users who cannot perform the Optimize and Vacuum maintenance activities on a Lakehouse.
