Hi folks,
I have a Power BI report with a pretty big dataset.
The fact table has about 22 million rows and 23 columns, plus a few small dimension tables.
It's a pretty clean star schema, though.
However, I was able to optimize the file size by removing unnecessary columns, rounding numbers, sorting by the most expensive column, and so on.
I was kind of proud to get the file from a few hundred MB down to 40 MB with these optimizations.
I even put everything directly into the SQL statement, because I use Snowflake as the source and I know query folding can be problematic with Snowflake.
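For context, the pushed-down query looks roughly like this (a simplified sketch with made-up table and column names, not the real query):

-- Simplified sketch (made-up names): select only the needed columns,
-- round measures at the source, and sort by the most expensive
-- (highest-cardinality) column so VertiPaq compresses the import better.
SELECT
    f.order_id,
    f.customer_key,
    f.product_key,
    f.order_date,
    ROUND(f.net_amount, 2) AS net_amount,  -- rounding shrinks the dictionary
    ROUND(f.quantity, 0)   AS quantity
FROM analytics.fact_sales AS f
ORDER BY f.customer_key;  -- sort key = the most expensive column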
However, we are on an F8 capacity, and when I refresh the dataset in the service the refresh fails because the in-memory size goes above 3,000 MB.
I really don't want to upgrade the capacity just because of this report.
I cannot reduce rows or columns anymore.
Is there anything else I can do?
Error in Service: [screenshot]
Memory while refreshing locally: [screenshot]
Hi. You can consider two things here. First, try it in a shared (Pro) workspace instead of the Fabric capacity. The Pro limits are on the data model size (1 GB) and single-table memory usage, so you might get through that way. Alternatively, you could consider mirroring the Snowflake database and building the data model on a lakehouse; that way you could use Direct Lake for the connection to Power BI, without the need for a refresh.
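If you want to check which tables and columns are eating the memory before going that route, one option is to query the VertiPaq DMVs from DAX Studio or any XMLA client; something along these lines (an illustrative query only; DICTIONARY_SIZE is just part of each column's footprint, but it surfaces the most expensive columns quickly):

-- Rough view of per-column memory in the imported model (sizes in bytes).
SELECT DIMENSION_NAME, ATTRIBUTE_NAME, DICTIONARY_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
ORDER BY DICTIONARY_SIZE DESC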
I hope that helps,
Happy to help!
@ibarrau thank you for your suggestions.
I tried it in a Power BI Pro Workspace and it worked.
So according to https://learn.microsoft.com/en-us/power-bi/enterprise/service-admin-premium-workloads the "Offline Semantic Model Size" is max 3 GB for an F8 capacity. But I always thought that meant just the Power BI file size, not the expanded in-memory model.
So how is the limit higher in a normal Pro Workspace?
And why am I paying thousands of bucks for the capacity when a normal pro workspace has higher limits?
I'm glad it worked. That fixes the problem and answers the original question.
Regarding this concern: a capacity has different limits and different features than Pro. Lower capacities can look similar to Pro and are not necessarily better on every single limit. With a dedicated capacity you get a lot of features, that's for sure. At the lower capacities you get benefits like storing a data model larger than 1 GB, but you run into memory limits on the whole model, whereas Pro limits the stored data model size and single-table memory rather than the whole model. Paying for a capacity should take many details into account, not just semantic model memory; there are a lot of benefits included, and you could be using Direct Lake.
I hope that makes sense.
Happy to help!