Sinduja
Regular Visitor

Could someone please clarify this question?

You have a Fabric F32 capacity that contains a workspace. The workspace contains a warehouse named DW1 that
is modelled by using MD5 hash surrogate keys.
DW1 contains a single fact table that has grown from 200 million rows to 500 million rows during the past year.
You have Microsoft Power BI reports that are based on Direct Lake. The reports show year-over-year values.
Users report that the performance of some of the reports has degraded over time and some visuals show errors.
You need to resolve the performance issues. The solution must meet the following requirements:
Provide the best query performance.
Minimize operational costs.
What should you do?
A. Change the MD5 hash to SHA256.
B. Increase the capacity.
C. Enable V-Order.
D. Modify the surrogate keys to use a different data type.
E. Create views.

2 ACCEPTED SOLUTIONS
lbendlin
Super User

None of the above. Run OPTIMIZE to consolidate the files into the desired 1GB chunks, and consider using yearly partitions.
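For reference, a minimal sketch of what that maintenance could look like in a Fabric notebook using Spark SQL. `FactSales` is a hypothetical table name standing in for the fact table; `OPTIMIZE ... VORDER` compacts small Delta files and applies V-Order, and the `VACUUM` step is optional cleanup:

```sql
-- Compact small Delta files into larger (~1 GB target) files
-- and apply V-Order write optimization in the same pass.
-- (FactSales is a placeholder for your actual fact table.)
OPTIMIZE FactSales VORDER;

-- Optionally remove files no longer referenced by the Delta log
-- (default retention is 7 days; shorten it only with care).
VACUUM FactSales;
```

Run this on a schedule (or after large loads) so the file layout doesn't degrade again as the table keeps growing.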

 

(they want you to increase the capacity, of course)


v-saisrao-msft
Community Support

Hi  @Sinduja,

Thank you for reaching out to Microsoft Fabric Community.

 

Given the data growth and the performance issues you're seeing in Power BI Direct Lake reports, the best approach would be to enable V-Order. This optimizes columnar storage, improving query performance significantly without increasing costs.

Adding to what @lbendlin said, running OPTIMIZE in your Fabric Warehouse will consolidate small Delta files into efficient 1GB chunks, reducing the overhead on queries. Since your reports rely on year-over-year analysis, implementing yearly partitions can further improve performance by limiting the amount of scanned data.
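To illustrate the yearly-partitioning idea, here is a hedged sketch (Spark SQL in a Lakehouse/notebook context — Warehouse tables don't expose user-defined partitioning this way). `FactSales` and `SaleDate` are hypothetical names; the point is that a partition column derived from the date lets year-over-year queries skip whole years of data:

```sql
-- Rebuild the fact table partitioned by year so queries that filter
-- on a year only scan the matching partitions.
CREATE TABLE FactSales_ByYear
USING DELTA
PARTITIONED BY (SaleYear)
AS
SELECT *,
       YEAR(SaleDate) AS SaleYear   -- derived partition column
FROM FactSales;
```

After validating the new table, reports can be repointed to it (or it can replace the original via a rename).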

 

Scaling up from F32 to a higher capacity is always an option, but before increasing costs, I would recommend enabling V-Order and optimizing storage. Let us know if you have any further queries.

 

If this post helps, please consider accepting it as a solution so other members can find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!


6 REPLIES 6
Tontaube2
Advocate I

Ok, Optimize would make sense...but to choose from the available options A-E:

is it:

B) Increase the capacity (only F64 can handle 500 million rows...). But: "Minimize operational costs."

C) Enable V-Order. Well, it's switched on by default – but since WHEN? (I.e., is it a possible "miss" during the redesign of the exam?)

 

 

I think the answer is [B) Increase the capacity].  Direct Lake in Fabric F32 Capacity supports tables up to 300 million rows. (F64 can handle 500 million rows...).  It will increase the licensing cost while still aiming to minimize operational costs.


yes, most likely.  Fabric features are changing at a fast pace, and the exams have a hard time keeping up and staying relevant.


