Hello,
We have a PPU license with A2 capacity, and our dataset in the Power BI service won't refresh because it hits the memory limit.
We cannot implement incremental refresh because of a business requirement: all the data needs to be refreshed every time.
I believe the only option is partitioning.
I created multiple partitions for a single table in Tabular Editor by splitting the data by date. Among our 25 tables, two contain large datasets, each exceeding 12 million rows.
For example:
I added this for the first partition:
#"Filtered Rows" = Table.SelectRows(#"Removed Columns", each [CreatedOn] >= #datetime(2024, 1, 1, 0, 0, 0))
I added this for the second partition:
#"Filtered Rows" = Table.SelectRows(#"Removed Duplicates", each [CreatedOn] >= #datetime(2023, 1, 1, 0, 0, 0) and [CreatedOn] <= #datetime(2023, 12, 31, 0, 0, 0))
The first partition refreshes successfully through a PowerShell script.
However, the second one fails with an error that falsely indicates a duplicate in the ID column.
After testing, I noticed that omitting the closing date allows the dataset to refresh without issue. For example, when I specify only '[CreatedOn] >= #datetime(2023, 1, 1, 0, 0, 0)' for the second partition, the dataset refreshes successfully. But when I include the closing date, I get the duplication error.
I would appreciate any help with this, or any other options you have in mind for creating and refreshing all the partitions.
Thanks.
There's that pesky problem of metadata changes. Whenever you change the data model, or even the report design, and publish to the workspace without further thought, a couple of things will happen:
- The incremental refresh partitions will be replaced with the single partition you just uploaded.
- A full refresh will be initiated across all currently specified partitions (depending on your granularity).
- After that completes successfully, your regular incremental refresh will resume.
To avoid this you need to choreograph metadata changes extremely carefully: push them with ALM Toolkit WITHOUT a refresh, and then manually refresh the affected partitions.
Whether you want it or not, you have to emulate the standard incremental refresh process to the letter. That includes the fact that your RangeStart should be inclusive and your RangeEnd should be exclusive. So the partition formula for the first partition is missing the RangeEnd filter, and the second one has the wrong condition and value for RangeEnd.
As for the indexes being duplicates, that's for you to avoid, for example by adding a large enough year multiplier to the index column to make sure the values can never be the same across partitions.
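For illustration, a minimal M sketch of that idea (the #"Sorted Rows" step name and the RowKey column are made up here; adapt them to your query):

// Per-partition index restarts at 1 in every partition, so on its own it would repeat across partitions
#"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "Index", 1, 1),
// Multiplying the year by 100,000,000 leaves room for ~100 million rows per year,
// so a key built this way can never collide with a key from another yearly partition
#"Added Key" = Table.AddColumn(#"Added Index", "RowKey", each Date.Year([CreatedOn]) * 100000000 + [Index], Int64.Type)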
Think of the RangeEnd of a given partition as being equal to the RangeStart of the next partition.
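As a sketch, boundary-aligned filters for two yearly partitions would look like this (illustrative years; note that both partitions should filter the same upstream step, whereas your first partition filtered #"Removed Columns" and your second #"Removed Duplicates", which is worth double-checking too):

// Partition 2023: start inclusive, end exclusive
#"Filtered Rows" = Table.SelectRows(#"Removed Duplicates", each [CreatedOn] >= #datetime(2023, 1, 1, 0, 0, 0) and [CreatedOn] < #datetime(2024, 1, 1, 0, 0, 0))

// Partition 2024: starts exactly where the 2023 partition ends
#"Filtered Rows" = Table.SelectRows(#"Removed Duplicates", each [CreatedOn] >= #datetime(2024, 1, 1, 0, 0, 0) and [CreatedOn] < #datetime(2025, 1, 1, 0, 0, 0))

Every row then lands in exactly one partition, so nothing is counted twice or dropped at the boundaries.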
Thank you. I think I should do it incrementally after all. If I define an incremental refresh policy in Tabular Editor to refresh the last 6 months and archive data from the last 20 months, then 20 partitions will be created, with the newest partition representing February 2024. Will a new partition be created automatically when the next month begins?
Yes, and the oldest partition will be dropped. If you don't want that, use a different policy, for example 6 months and 3 years.
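For reference, a table with a refresh policy filters on the reserved RangeStart and RangeEnd parameters (both of type DateTime), and the service substitutes the boundaries for each generated partition. A minimal sketch against your CreatedOn column, reusing your existing step name:

// The service generates one partition per period and binds RangeStart/RangeEnd for each
#"Filtered Rows" = Table.SelectRows(#"Removed Duplicates", each [CreatedOn] >= RangeStart and [CreatedOn] < RangeEnd)

Keeping RangeStart inclusive and RangeEnd exclusive here follows the same boundary rule as the manual partitions above.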
Thanks a lot. I'm stuck on this and would appreciate your thoughts on the following approaches:
First approach:
Second approach:
I need to connect to the live database in order to test whether refreshing works through my partitions.
By the way, can I use the automatic refresh type to refresh only the partitions where data changes occur?
Thanks a lot!
I guess you'll want to tune your live database a bit. Make sure the indexes and statistics are up to date.
What does the memory capacity error say exactly? Usually refresh failures are due to timeouts, not memory issues.
"By the way, can I use the automatic refresh type to refresh only the partitions where data changes occur?"
That would be nice, wouldn't it? However, that's not how incremental refresh works. If you try to use "detect data changes" you will be hit with a massive, massive penalty in the form of a canary dataset, basically doubling your storage and compute needs.
No, it's all you. You, the human, need to know which partitions may need refreshing, and you then need to selectively refresh them (and their dependents).
Thank you so much.
Yes, we are reducing the size of the database to improve performance.
The error message says: "This operation was canceled because there wasn't enough memory to finish running it."
I connected the PBI dataset to the 30-month DB, partitioned it in Tabular Editor, and then refreshed each partition separately using PowerShell (XMLA endpoint). On the first try, since the partitions were empty, each partition took around 4 minutes to refresh, but subsequent refreshes took less than 20 seconds (all using the automatic refresh type). This led me to think that using the automatic refresh type would be similar to employing "detect data changes".
I always get that "not enough memory" error in the PBI service, so I'm trying to divide the tables into multiple PowerShell batches, partition the large tables, and refresh the partitions sequentially to avoid it.
Thanks again for the comment.