Heena_9980400
Helper IV

Help needed / tricks to handle huge data in Power BI

Hi Experts,

 

I am dealing with 10 years of historical data (2011 to date), with no custom logic, DAX, or RLS in use. The data volume is huge; would composite mode help in handling it? Please advise.

 

Best Regards,

K

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi,

 

Some things you can do:

 

  • Use the large dataset storage format in Power BI
    • (This is a Premium feature, but it can improve performance)
  • Only bring in the columns you actually need
  • Normalize your model
  • Negotiate a reasonable time window of data with the dataset users:
    • If users only need the most recent data, you can, for instance, load just the last year of data dynamically (see the first sketch after this list)
  • Pre-aggregate the data:
    • Depending on your source and the visuals you want to display, you can create aggregated tables in the source and load only the aggregated values into Power BI. For instance, if you have a fact table with 100 rows per day and only a bar chart showing monthly values, you can aggregate the values by month in the source and massively reduce the amount of data (see the second sketch after this list)
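As an illustration of the dynamic time window point, here is a minimal Power Query (M) sketch. The server, database, table, and column names (FactSales, OrderDate, ProductKey, SalesAmount) are placeholders, and it assumes OrderDate is of type date; adapt them to your own source.

    let
        // Hypothetical connection and fact table; replace with your own source
        Source = Sql.Database("YourServer", "YourDatabase"),
        Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
        // Only bring in the columns the report actually needs
        SelectCols = Table.SelectColumns(Fact, {"OrderDate", "ProductKey", "SalesAmount"}),
        // Cut-off date: 12 months before today, re-evaluated at each refresh
        CutOff = Date.AddMonths(Date.From(DateTime.LocalNow()), -12),
        // Keep only the last year of rows; older history stays in the source
        LastYear = Table.SelectRows(SelectCols, each [OrderDate] >= CutOff)
    in
        LastYear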

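And a sketch of the pre-aggregation idea, done in Power Query here just for illustration (building the aggregated table in the source itself, e.g. as a SQL view, achieves the same result and usually performs better). It references the hypothetical FactSales query above and assumes the same OrderDate and SalesAmount columns.

    let
        // Placeholder reference to the detailed (daily-grain) query
        Source = FactSales,
        // Reduce each row's date to the first day of its month
        WithMonth = Table.AddColumn(Source, "Month", each Date.StartOfMonth([OrderDate]), type date),
        // Collapse the daily rows into one row per month with the summed amount
        Monthly = Table.Group(WithMonth, {"Month"},
            {{"SalesAmount", each List.Sum([SalesAmount]), type number}})
    in
        Monthly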
 

Composite modeling can also help you if your visuals show only fractions of the data. If you have to load all of it, you will struggle just as much as with import mode alone.

You can try these ideas individually or combine them all for better performance.

 

Kind regards,
José
Please mark this answer as the solution if it resolves your issue.
Appreciate your kudos! 🙂

 

