Hi Team,

As a Fabric user working with Direct Lake semantic models (created via lakehouse SQL endpoints or OneLake), I've run into significant challenges backing up, copying, or migrating these models. Compared to Import mode datasets, the process is restricted and complicated:

- Backups require XMLA endpoints, Tabular Editor, or custom scripts (e.g., Semantic Link Labs), all of which have limitations on free tiers or during live edits.
- Converting between Direct Lake modes (SQL vs. OneLake) involves manual TMDL edits in Power BI Desktop, which risks errors.
- Recreating the underlying lakehouse tables often loses complex measures (e.g., MTD, YTD, dynamic year-over-year comparisons), forcing full rebuilds. This is especially painful in production with 100+ reports.

The result is a lot of rework, time, and effort.

Proposed Improvements:

- A built-in "Export/Import" or "Clone" button for semantic models in the Fabric UI, supporting both Direct Lake modes without XMLA setup.
- An auto-backup or "snapshot" feature that runs before lakehouse changes and preserves measures and relationships even if tables are recreated (e.g., by decoupling measures from table IDs). A rough sketch of how this can be scripted today is at the end of this post.
- Enhanced version history with unlimited versions, easy restores, and the option to restore measures only.
- A simplified migration wizard between Direct Lake on SQL and Direct Lake on OneLake, with schema validation.

This would greatly ease management for enterprise users, reducing downtime and errors. Think of Power BI's PBIX exports, but optimized for Fabric's lake-centric architecture.

Use Case: Production environments with dynamic reports relying on complex DAX measures over large Delta tables.

Looking forward to community votes and a Microsoft response!
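P.S. For anyone hitting this today, here is a rough sketch of the measure-snapshot workaround, assuming a Fabric notebook with Semantic Link and Semantic Link Labs installed. The model and workspace names are placeholders, and the calls shown (fabric.list_measures, connect_semantic_model, tom.add_measure) as well as the snapshot column names should be verified against your installed library versions:

```python
# Sketch: snapshot measure definitions before recreating lakehouse tables,
# then re-apply them to the rebuilt model. All names below are placeholders,
# and the API details are assumed from the Semantic Link / Semantic Link Labs
# docs -- verify against your installed versions.
import pandas as pd
import sempy.fabric as fabric
from sempy_labs.tom import connect_semantic_model

DATASET = "Sales Model"    # placeholder: your Direct Lake semantic model
WORKSPACE = "Production"   # placeholder: workspace containing the model
SNAPSHOT = "/lakehouse/default/Files/measure_snapshot.csv"  # attached-lakehouse mount

# --- Before the lakehouse change: dump every measure's DAX to a CSV ---
measures = fabric.list_measures(dataset=DATASET, workspace=WORKSPACE)
measures.to_csv(SNAPSHOT, index=False)
print(f"Snapshotted {len(measures)} measures to {SNAPSHOT}")

# --- After the tables are recreated: re-apply the saved definitions ---
# Assumes the rebuilt model has the same table names and no conflicting
# measures; add_measure would fail on duplicates.
saved = pd.read_csv(SNAPSHOT)
with connect_semantic_model(dataset=DATASET, workspace=WORKSPACE, readonly=False) as tom:
    for _, m in saved.iterrows():
        tom.add_measure(
            table_name=m["Table Name"],   # column names assumed from list_measures output
            measure_name=m["Measure Name"],
            expression=m["Measure Expression"],
        )
```

Even this only covers measures; relationships, display folders, and format strings would need similar TOM scripting, which is exactly why a built-in snapshot/clone feature would be far less brittle.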