jochenj
Advocate II

Azure AS Migration: Experience from the Field, Open Questions

We plan to migrate an existing enterprise reporting solution to MS Fabric. It is currently based on Azure Analysis Services plus around 250 Power BI Pro users and ~300 AzureAS XMLA-endpoint users (3rd-party OLAP tool). I want to share my experience on this journey so far and hope that someone can bring new ideas to make the planned migration a success.

We struggle to find the right architecture/setup that balances operational costs against the added value and opportunities of the new Fabric features. In the scenarios evaluated so far, the costs with Fabric would explode: we would need at least an F256, or maybe even an F512, which would be roughly 12 times more expensive than the current solution (~€3,500/month).

Situation:

  • Azure Analysis Services (AzureAS) hosting one model in import mode
  • daily scheduled load via on-premises gateway from a SQL Server DB/DWH (~700 GB, but only a fraction is loaded into AzureAS)
  • SSMS shows ~40 GB estimated model size in the database properties; ~250 tables and hundreds of DAX measures in the model
  • processing mode (early morning, for 2 hours)
    • pricing tier used = S8v2 (200 GB memory, 640 QPUs)
    • Azure metric for "Memory" shows peaks up to 111 GB
  • normal reporting mode during the day
    • pricing tier used = S4 (100 GB memory, 400 QPUs)
    • Azure metric for "Memory" shows peaks up to 79 GB, baseline average ~50-60 GB (see the metrics sketch below)

 

Challenges:

Memory in the AzureAS world is comparatively "cheap"; put differently, the memory available in Fabric SKUs is scarce/expensive:

[Image: jochenj_0-1713371474805.png]

 

Scenarios we evaluated so far:

  1. Fabric Warehouse > shortcuts > Fabric Lakehouse > Direct Lake > Power BI
    • Findings: most promising, but we get "capacity limit reached" errors on report queries, maybe because there are no table partitions?!
    • Is there any option to create table partitions in the model when using Direct Lake mode? (See the Delta partitioning sketch after this list.)
  2. Fabric Warehouse > DirectQuery > Power BI
    • Findings: works really well for some report queries,
    • but there are also many report queries where we run into timeouts (the DirectQuery SQL execution plan looks like a nightmare once the DAX is translated to SQL)
  3. Fabric Warehouse > import mode > Power BI
    • Even when creating small table partitions, we get "out of memory" errors on processing with SKUs up to F256
    • For the report-query load we would also need at least an F256
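
Regarding scenario 1: as far as we understand, a Direct Lake semantic model does not expose classic model/table partitions, so the only lever seems to be the underlying Delta table itself (partition columns, V-Order, file compaction). Here is a minimal PySpark sketch of what we would try in a Fabric notebook; the table and column names (staging_fact_sales, fact_sales, LoadMonth) are placeholders:

```python
# Minimal sketch (Fabric notebook, PySpark): write the large fact table as a partitioned,
# V-Ordered Delta table so Direct Lake can prune files instead of scanning everything.
# Table and column names are placeholders for the real model.

# Enable V-Order for this session (Fabric-specific parquet writer optimization).
spark.conf.set("spark.sql.parquet.vorder.enabled", "true")

df = spark.read.table("staging_fact_sales")   # placeholder staging table

(
    df.write
      .mode("overwrite")
      .format("delta")
      .partitionBy("LoadMonth")               # placeholder column; keep partition cardinality low
      .saveAsTable("fact_sales")
)

# Compact small files so Direct Lake reads fewer, larger rowgroups.
spark.sql("OPTIMIZE fact_sales")
```

The idea is that fewer, larger, V-Ordered parquet files let Direct Lake transcode only what a query actually touches, which might keep us below the capacity limit without remodelling.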

Constraints

  • Massively remodelling the existing model to increase performance is not an option

Any ideas/thoughts are highly welcome!

 

 

1 ACCEPTED SOLUTION
v-yaningy-msft
Community Support

Hi, @jochenj 

Based on the information you provided, you can refer to the following points.


1. Memory management: AzureAS uses a lot of memory to process and query data. If the memory limit is reached, you may need to scale the AzureAS instance up vertically or optimize the model to use less memory. However, as you point out, memory in MS Fabric SKUs is more expensive. You may need to balance the cost of the additional memory against the performance benefit it provides.
Facing memory issues with the Azure analysis services - Microsoft Q&A
Analysis Services memory properties | Microsoft Learn
2. In Direct Lake mode you ran into a "capacity limit reached" error. On whether you can create partitions in Direct Lake mode in MS Fabric, you can refer to the following link.
[Image: vyaningymsft_0-1713423449777.png]
A Comprehensive Guide to Direct Lake Semantic Model Migration for Power BI – Hitachi Solutions (hita...
3. DirectQuery mode sometimes runs into timeouts. This may be due to the complexity of translating DAX to SQL. You may need to optimize your DAX queries or consider using a different query mode.
4. In import mode, even with small table partitions and an F256 SKU, you get an out-of-memory error. This indicates that the model may be too large for the available memory. You may want to consider ways to reduce the size of the model, such as removing unnecessary columns or rows, or using more efficient data types; a sketch of a table-by-table refresh follows this list.
AAS: How can I optimize the memory usage of an Azure Analysis Services instance? - Stack Overflow
5. Regarding the cost of using MS Fabric, you may want to review the Azure Cost Optimization recommendations or consider using the Azure Cost Management tool to better understand and manage your costs.

Microsoft Cost Management | Microsoft Azure
Tutorial - Reduce Azure costs with recommendations - Microsoft Cost Management | Microsoft Learn
6. You mentioned that a large-scale remodelling of the existing model to improve performance is not an option. However, there are smaller optimizations that could still have a significant impact on performance. For example, you might consider optimizing the memory settings in Analysis Services:

Optimizing memory settings in Analysis Services - SQLBI
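
As a hedged illustration of point 4: if the out-of-memory errors occur during processing, you could try the enhanced refresh REST API to refresh the model table by table instead of in one full pass, which usually lowers the memory peak. A minimal sketch; the token, workspace/dataset IDs and table names are placeholders:

```python
# Minimal sketch: enhanced refresh via the Power BI REST API, refreshing one table at a time
# to keep the memory peak during processing lower than a single full-model refresh.
# ACCESS_TOKEN, WORKSPACE_ID, DATASET_ID and the table names are placeholders.
import requests

ACCESS_TOKEN = "<aad-token-with-Dataset.ReadWrite.All-scope>"
WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<semantic-model-guid>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Refresh the largest tables one by one instead of the whole model at once.
for table in ["FactSales", "FactInventory", "DimCustomer"]:   # placeholder table names
    body = {
        "type": "Full",
        "objects": [{"table": table}],
        "commitMode": "transactional",
        "maxParallelism": 2,
    }
    resp = requests.post(url, headers=headers, json=body)
    resp.raise_for_status()
    print(f"Refresh of {table} accepted: HTTP {resp.status_code}")
```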

I hope this helps you in some way.

Best Regards,
Yang
Community Support Team

 

If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!

How to get your questions answered quickly --  How to provide sample data in the Power BI Forum

 

