
rmcgrath
Helper III

Too many rows?

Hello - I am trying to pull in a General Ledger Entries table via an OData URL from Business Central.  Unfortunately, it has 10,175,525 rows!  Power BI actually pulled it in, but when I tried to apply transformations and then Close & Load, it failed.  I'm not sure whether that failure is due to the source?

 

Any advice on how to handle a table of this size is GREATLY appreciated!

1 ACCEPTED SOLUTION
v-zhangtin-msft
Community Support

Hi, @rmcgrath 

 

Here are a few strategies you can consider to manage the data more effectively:

 

DirectQuery Mode: If possible, consider using DirectQuery mode instead of importing the data. This allows Power BI to query the data directly from the source without loading it into memory, which can be beneficial for very large datasets. DirectQuery in Power BI - Power BI | Microsoft Learn

 

Incremental Refresh: Instead of loading the entire dataset at once, you can set up incremental refresh in Power BI. This allows you to load only new or changed data, which can significantly reduce the load time and improve performance. Incremental refresh for semantic models in Power BI - Power BI | Microsoft Learn

 

Dataflows: Use Power BI Dataflows to preprocess and transform the data before loading it into Power BI. This can help in offloading some of the transformation work and improve performance. Introduction to dataflows and self-service data prep - Power BI | Microsoft Learn
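For the incremental refresh option, the query has to filter on the RangeStart and RangeEnd datetime parameters that Power BI manages for you. A minimal Power Query (M) sketch, assuming the feed exposes a Posting_Date column and that you have already created the two parameters (the endpoint URL and column name below are placeholders, not your actual source):

```m
// Sketch only: <base-url> and Posting_Date are placeholders for your
// Business Central OData endpoint and date column.
// RangeStart and RangeEnd must exist as datetime parameters in Power Query;
// Power BI substitutes the refresh window into them at refresh time.
let
    Source = OData.Feed("https://<base-url>/GeneralLedgerEntries"),
    // Use >= on one boundary and < on the other so no row ever lands
    // in two partitions.
    Filtered = Table.SelectRows(
        Source,
        each [Posting_Date] >= Date.From(RangeStart)
             and [Posting_Date] < Date.From(RangeEnd)
    )
in
    Filtered
```

Because the filter is applied in the source step, Power Query can fold it back to the OData feed, so each refresh only pulls the rows inside the window instead of all 10 million.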

 

Best Regards,

Community Support Team _Charlotte

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


2 REPLIES

rajendraongole1
Super User

Hi @rmcgrath - if you're pulling it via an OData feed, use filtering and pagination to reduce the number of rows being pulled into Power BI.

Solved: Connecting to a large data table in Business Centr... - Microsoft Fabric Community

Using OData with Queries That are Set with a Top Number of Rows - Business Central | Microsoft Learn
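To give a feel for what server-side filtering and pagination look like in the URL itself, here is a sketch using standard OData query options ($filter, $top, $skip). The base URL, entity set, and field name are placeholders; substitute the ones from your own Business Central endpoint:

```
-- First page: only rows posted on or after 2024-01-01, 5,000 at a time
https://<base-url>/GeneralLedgerEntries?$filter=Posting_Date ge 2024-01-01&$top=5000

-- Next page: same filter, skipping the rows already fetched
https://<base-url>/GeneralLedgerEntries?$filter=Posting_Date ge 2024-01-01&$top=5000&$skip=5000
```

The key point is that the filter runs on the Business Central side, so Power BI never has to download the full 10-million-row table.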

 

 





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!




