rmcgrath
Advocate II

Too many rows?

Hello - I am trying to pull in a General Ledger Entries table via an OData URL from Business Central.  Unfortunately, it has 10,175,525 rows!  Power BI actually pulled it in, but when I tried to apply transformations and then Close & Load, that is where it failed.  I'm not sure whether the failure is due to the source.

 

Any advice on how to handle a table of this size is GREATLY appreciated!

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi, @rmcgrath 

 

Here are a few strategies you can consider to manage the data more effectively:

 

DirectQuery Mode: If possible, consider using DirectQuery mode instead of importing the data. This allows Power BI to query the data directly from the source without loading it into memory, which can be beneficial for very large datasets. DirectQuery in Power BI - Power BI | Microsoft Learn

 

Incremental Refresh: Instead of loading the entire dataset at once, you can set up incremental refresh in Power BI. This allows you to load only new or changed data, which can significantly reduce the load time and improve performance. Incremental refresh for semantic models in Power BI - Power BI | Microsoft Learn
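For reference, incremental refresh is driven by two reserved Power Query parameters, RangeStart and RangeEnd, which must filter a date/time column in the query. A minimal sketch in Power Query M, assuming the GL entries expose a Posting_Date datetime column (the endpoint URL and column name here are placeholders; adjust them to your environment):

```powerquery
let
    // RangeStart and RangeEnd must be defined as datetime parameters in Power Query
    Source = OData.Feed("https://<your-bc-endpoint>/GeneralLedgerEntries"),
    // Filter on the date column so Power BI can partition refreshes;
    // use >= on one boundary and < on the other to avoid duplicated rows
    Filtered = Table.SelectRows(
        Source,
        each [Posting_Date] >= RangeStart and [Posting_Date] < RangeEnd
    )
in
    Filtered
```

With a filter like this in place, you can enable incremental refresh on the table in Power BI Desktop and configure how much history to keep versus how much to refresh.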

 

Dataflows: Use Power BI Dataflows to preprocess and transform the data before loading it into Power BI. This can help in offloading some of the transformation work and improve performance. Introduction to dataflows and self-service data prep - Power BI | Microsoft Learn

 

Best Regards,

Community Support Team _Charlotte

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

View solution in original post

2 REPLIES 2
rajendraongole1
Super User

Hi @rmcgrath - since you're pulling it via an OData feed, use filtering and pagination to reduce the number of rows being pulled into Power BI:

Solved: Connecting to a large data table in Business Centr... - Microsoft Fabric Community

Using OData with Queries That are Set with a Top Number of Rows - Business Central | Microsoft Learn
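As a sketch of the server-side filtering idea, OData query options such as $filter and $top can be appended to the feed URL so Business Central returns fewer rows per request (the endpoint URL, company name, and date value below are placeholders, not your actual feed):

```powerquery
let
    // $filter pushes the row reduction to the server; $top caps the result size
    Url = "https://<your-bc-endpoint>/Company('<YourCompany>')/GeneralLedgerEntries"
        & "?$filter=Posting_Date ge 2024-01-01&$top=500000",
    Source = OData.Feed(Url)
in
    Source
```

Combining a date filter like this with the query-object approach in the second link keeps each pull to a manageable size.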

 

 





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!




