Hi,
I'm trying to run Maintenance on Lakehouse tables that are in OneLake. However, when I try to run Maintenance on a table with only the Optimize and V-Order flags selected, I receive the following error:
I have also tried running the maintenance from a Spark SQL notebook, and the operation failed with a similar error. When I put in the full OneLake and workspace URL, it errored again, saying that the operation could not be completed on Dataverse tables.
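(For reference, what I ran in the notebook was roughly the sketch below; `MyLakehouse` and `account` are placeholder names standing in for my real lakehouse and table.)

```python
# Rough sketch of the maintenance attempt from a Fabric notebook.
# "MyLakehouse" and "account" are placeholder names, not the real objects.

# Enable V-Order writes for this Spark session (Fabric-specific setting).
spark.conf.set("spark.sql.parquet.vorder.enabled", "true")

# Compact small Parquet files and apply V-Order to the Delta table.
spark.sql("OPTIMIZE MyLakehouse.account VORDER")

# Remove data files no longer referenced by the Delta log
# (subject to the default 7-day retention period).
spark.sql("VACUUM MyLakehouse.account")
```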
Please let me know how this can be resolved.
Thanks,
Russ
Hi @rtolsma ,
I believe the issue is that the Dataverse data is not actually managed by Fabric: the tables are pointers to your Dataverse environment, so the error, while annoying, is correct.
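(As a quick check, you can inspect from a notebook where the table's files actually live; `account` below is a placeholder table name. For a Fabric Link table, the location resolves to Dataverse-managed storage rather than to Delta files the workspace owns, which is why write operations like OPTIMIZE and VACUUM are rejected.)

```python
# Placeholder table name "account"; run in a notebook attached to the lakehouse.
detail = spark.sql("DESCRIBE DETAIL account").collect()[0]

# For a linked/shortcut table, the location points at storage managed by
# Dataverse rather than at files this workspace owns.
print(detail["location"])
print(detail["format"])
```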
Proud to be a Super User!
Try downloading OneLake File Explorer to see your OneLake files:
https://learn.microsoft.com/en-us/fabric/onelake/onelake-file-explorer
Proud to be a Super User!
I'm pretty sure external shortcuts don't consume OneLake storage.
Do you see any indications that it consumes OneLake storage?
Edit: Sorry, just became aware of your screenshot from the capacity metrics app.
That is very surprising.
Can there be some other delta tables in this workspace (not managed by Dataverse)?
E.g. are you running some notebooks, pipelines or dataflows which create a copy of some of the Dataverse tables inside Fabric?
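(For illustration, such a copy would look something like the sketch below, with hypothetical names; a table written this way is owned by the lakehouse itself, unlike the linked originals.)

```python
# Hypothetical names: "account" is the Dataverse-linked source table,
# "account_copy" a Delta table owned by the lakehouse itself.
(spark.read.table("account")
      .write.format("delta")
      .mode("overwrite")
      .saveAsTable("account_copy"))

# A table created this way stores its Delta files in the workspace's OneLake,
# so it both consumes OneLake storage and supports OPTIMIZE/VACUUM.
```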
All the tables are directly related to the Link to Fabric configuration. However, this data has only been exported within the last 7 days, so it's quite possible that an optimize and vacuum job hasn't been run yet on Microsoft's underlying storage.
The docs here, https://learn.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-view-in-fabric#d..., say:
"When you link to Fabric from Power Apps, the system creates an optimized replica of your data in delta parquet format, the native format of Fabric and OneLake, using Dataverse storage such that your operational workloads aren't impacted." Is it possible that OneLake was chosen as the destination?
Proud to be a Super User!
With Fabric Link (a.k.a. "Link to Fabric"), OneLake is the only available destination; it is always used when tables are exported from D365 Sales or from D365 Finance and Operations.
Really appreciate all the suggestions! I think I have a pretty good idea how to move forward.
You mention Dataverse tables.
Perhaps it is not possible to do Lakehouse maintenance on Dataverse tables?
(Perhaps those tables are fully managed by the Dataverse integration mechanism).
Perhaps it isn't, but I can't find any documentation to confirm this. If that's the case, I have to say it's quite frustrating to see this fail and to experience this kind of performance. I need to provide a solution, or it almost becomes unusable.
Hi @rtolsma,
The error message suggests this issue is related to permissions. I'd suggest you check whether your permission settings match what the error message indicates.
BTW, could you please share some more detailed information about this issue? It would help us clarify your scenario and test to troubleshoot. Have you applied any changes to the credentials used in the ADLS Gen2 connection?
Regards,
Xiaoxin Sheng
This is a OneLake lakehouse, and the user is a workspace admin and a Fabric SKU admin. I cannot find any other place to manage permissions, and given that this is a Fabric operation triggered by clicking the Maintenance menu item for the Delta Lake table, it seems to me that permissions should not be the issue. If they are, then the Maintenance menu should be disabled for a user who cannot perform the Optimize and Vacuum maintenance activities on a Lakehouse.