Hello - I am trying to pull a General Ledger Entries table into Power BI via an OData URL from Business Central. Unfortunately, it has 10,175,525 rows! Power BI actually pulled it in, but when I applied transformations and then tried Close & Load, that is where it failed. I'm not sure whether that is due to the source?
Any advice on how to handle a table of this size is GREATLY appreciated!
Hi, @rmcgrath
Here are a few strategies you can consider to manage the data more effectively:
DirectQuery Mode: If possible, consider using DirectQuery mode instead of importing the data. This allows Power BI to query the data directly from the source without loading it into memory, which can be beneficial for very large datasets. DirectQuery in Power BI - Power BI | Microsoft Learn
Incremental Refresh: Instead of loading the entire dataset at once, you can set up incremental refresh in Power BI. This loads only new or changed data, which can significantly reduce load time and improve performance (a minimal M sketch follows this list). Incremental refresh for semantic models in Power BI - Power BI | Microsoft Learn
Dataflows: Use Power BI Dataflows to preprocess and transform the data before loading it into Power BI. This can help in offloading some of the transformation work and improve performance. Introduction to dataflows and self-service data prep - Power BI | Microsoft Learn
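To illustrate the incremental refresh item above: Power BI requires two datetime parameters named RangeStart and RangeEnd defined in Power Query, and the source query must filter on them. Here is a minimal M sketch assuming a Business Central OData V4 feed - the service URL, entity name (GeneralLedgerEntries), and column name (Posting_Date) are placeholders you would replace with your own:

```
let
    // Placeholder endpoint - substitute your tenant's Business Central OData V4 URL
    Source = OData.Feed(
        "https://api.businesscentral.dynamics.com/v2.0/<tenant>/production/ODataV4",
        null,
        [Implementation = "2.0"]
    ),
    // Entity name is an assumption; use the name your feed actually exposes
    GLEntries = Source{[Name = "GeneralLedgerEntries"]}[Data],
    // RangeStart/RangeEnd are the datetime parameters incremental refresh expects;
    // Date.From handles the case where Posting_Date is a date (not datetime) column
    Filtered = Table.SelectRows(
        GLEntries,
        each [Posting_Date] >= Date.From(RangeStart) and [Posting_Date] < Date.From(RangeEnd)
    )
in
    Filtered
```

For this to help with a 10-million-row table, the filter needs to fold back to the OData source, so that each refresh partition downloads only its own slice rather than the whole table.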
Best Regards,
Community Support Team _Charlotte
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @rmcgrath - if you're pulling it via an OData feed, use filtering and pagination to reduce the number of rows being pulled into Power BI (a rough sketch follows the links below).
Solved: Connecting to a large data table in Business Centr... - Microsoft Fabric Community
Using OData with Queries That are Set with a Top Number of Rows - Business Central | Microsoft Learn
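As a rough sketch of that filtering approach, assuming the same kind of Business Central OData V4 feed (the URL, entity, and field names below are placeholders, not taken from this thread), you can push $filter - and $top/$skip for paging - into the query string so the service returns only a slice of the ledger:

```
let
    // $filter, $top, and $skip are standard OData query options; the date
    // range below is an example - restrict it to the period you report on
    Url = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/production/ODataV4/GeneralLedgerEntries"
        & "?$filter=Posting_Date ge 2024-01-01 and Posting_Date le 2024-12-31",
    Source = OData.Feed(Url, null, [Implementation = "2.0"])
in
    Source
```

Filtering at the source this way avoids transferring all 10,175,525 rows before Power Query ever gets to transform them.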
Proud to be a Super User!