Hi Team,
As a Fabric user working with Direct Lake semantic models (created via lakehouse SQL endpoints or OneLake), I've encountered significant challenges in backing up, copying, and migrating these models. Compared to Import mode datasets, the process is restricted and complicated, which results in considerable rework, time, and effort.
Proposed Improvements:
This would greatly ease management for enterprise users, reducing downtime and errors: something similar to Power BI's PBIX export, but optimized for Fabric's lake-centric architecture.
Use Case: Production environments with dynamic reports relying on complex DAX measures over large Delta tables.
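In the meantime, here is a partial workaround sketch others may find useful. The Fabric REST API exposes a "Get Semantic Model Definition" endpoint that can return a model's definition as base64-encoded TMDL parts, which can be saved as a text backup. This is a minimal sketch, not an official procedure: the endpoint path and the format=TMDL parameter reflect my reading of the public Fabric REST API, the workspace id, model id, and bearer token are placeholders you must supply, and whether Direct Lake models are fully round-trippable through this endpoint should be verified against the current docs.

```python
# Hedged workaround sketch: export a semantic model's definition as TMDL
# text via the Fabric REST API "Get Semantic Model Definition" endpoint.
# Workspace id, model id, and token acquisition are placeholders.
import base64
import json
from urllib import request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def definition_url(workspace_id: str, model_id: str, fmt: str = "TMDL") -> str:
    """Build the getDefinition URL for a semantic model (format assumed)."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/semanticModels/{model_id}/getDefinition?format={fmt}")

def export_model(workspace_id: str, model_id: str, token: str) -> dict:
    """POST to getDefinition and decode each base64-encoded part to text.

    Returns a dict mapping part path (e.g. 'definition/model.tmdl')
    to its decoded contents, suitable for writing to disk as a backup.
    """
    req = request.Request(
        definition_url(workspace_id, model_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
    return {
        part["path"]: base64.b64decode(part["payload"]).decode("utf-8")
        for part in body["definition"]["parts"]
    }
```

The decoded parts could then be committed to Git, or replayed into another workspace via the matching updateDefinition endpoint (again, an assumption about the API surface, so please verify before relying on it in production).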
Looking forward to community votes and Microsoft response!