Hi,
I have been testing Fabric over the last month and I really like the Direct Lake connection.
This made me wonder: if we use Direct Lake and no actual data is stored in the semantic model, how does that relate to the storage format chosen for that workspace?
At the workspace level I can choose between "Small semantic model storage format" and "Large semantic model storage format". If I am only going to use Direct Lake connections, are there any benefits or constraints to choosing one over the other?
Hi @morz3d
Regarding the choice between "Small semantic model storage format" and "Large semantic model storage format" at the workspace level: this setting primarily affects how data is stored and managed within the semantic models themselves, rather than directly influencing the performance or capabilities of Direct Lake queries.
Because a Direct Lake connection reads data directly from the Delta tables in OneLake rather than importing a copy into the semantic model, the storage format choice doesn't directly affect the performance of those queries. Your decision should therefore be guided by how you plan to use semantic models alongside Direct Lake. If your strategy leans heavily on Direct Lake with minimal reliance on models that import and store data, the "Small semantic model storage format" may be the more cost-effective default. Conversely, if you anticipate needing larger or more complex import models for a subset of your analytics needs, the "Large semantic model storage format" may offer the necessary performance benefits.
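If you want to double-check which format an individual semantic model actually ended up with (as I understand it, the workspace setting is only the default for new models), a rough sketch like the one below lists each model's targetStorageMode through the Power BI REST API. The token acquisition, workspace GUID, and the exact property values are assumptions on my part, so please verify against the current API documentation:

```python
# Minimal sketch (assumptions: a valid Azure AD access token and the workspace GUID
# are obtained elsewhere; both placeholder values below are hypothetical).
import requests

ACCESS_TOKEN = "<aad-access-token>"   # assumed acquired e.g. via MSAL / service principal
WORKSPACE_ID = "<workspace-guid>"     # assumed: the Fabric workspace being inspected

url = f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/datasets"
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

# Each semantic model (dataset) should expose a targetStorageMode property;
# as far as I know, "Abf" maps to the small format and "PremiumFiles" to the large one.
for ds in resp.json().get("value", []):
    print(ds.get("name"), "->", ds.get("targetStorageMode"))
```

An individual model can also be switched to the large format from its own settings page, so the workspace-level choice isn't locked in for models created later.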
Best Regards,
Jayleny
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.