Kevin8
Frequent Visitor

Problems with importing data from AWS Redshift

Hi everyone,
 

I have a Power BI report stored in a Power BI workspace within my work organisation. It is set to refresh weekly on Monday mornings.

The refresh fails, and the reason given is that the dataset takes up too much memory:

Data source error: Resource Governance: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 25518 MB, memory limit 25514 MB, database size before command execution 85 MB. See Troubleshoot XMLA endpoint connectivity in Power BI - Microsoft Fabric to learn more.

However, a copy of this report does refresh from another workspace at the same time. Both workspaces have the same license configuration and capacity.

Further, there is a near-identical report that brings in a larger dataset and continues to refresh without any issues. That report sources its data from a data warehouse internal to the organisation.

Also, there is another report in the same workspace that is larger, and that brings in a larger dataset from AWS Redshift without any refresh issues.

For reference:
- The report that fails is listed as 495 MB in size, although the actual file size is 265,100 KB.
- The 2nd report in the same workspace, which gets its data internally, is 748 MB.
- The 3rd report in the same workspace, which gets its data from Redshift without issue, is 501 MB.

8 REPLIES
tayloramy
Super User

Hi @Kevin8

 

Switching to the large semantic model storage format might help.
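For reference, the large storage format can be toggled in the semantic model settings, or per dataset through the Power BI REST API ("Update Dataset In Group", `targetStorageMode` property). A minimal sketch of what that request looks like, with placeholder workspace/dataset IDs (an Azure AD bearer token would be needed to actually send it):

```python
# Sketch: build the REST request that switches a dataset to the large
# semantic model storage format ("PremiumFiles"). IDs are placeholders;
# send the PATCH with any HTTP client plus an Authorization header.
import json

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_large_model_patch(workspace_id: str, dataset_id: str) -> tuple[str, str]:
    """Return (url, json_body) for the PATCH that sets targetStorageMode.
    "PremiumFiles" = large storage format, "Abf" = small storage format."""
    url = f"{API_ROOT}/groups/{workspace_id}/datasets/{dataset_id}"
    body = json.dumps({"targetStorageMode": "PremiumFiles"})
    return url, body

url, body = build_large_model_patch("<workspace-id>", "<dataset-id>")
print(url)
print(body)
```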

Is your 11M-row fact table a full load on refresh, or is it incremental? If it is a full load, you may want to look at making it incremental.
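Incremental refresh is normally set up in Power BI Desktop (RangeStart/RangeEnd parameters plus the table's incremental refresh dialog); under the hood it becomes a refresh policy on the table. As a rough, illustrative sketch of what that policy looks like in TMSL (the periods here are made-up values, not a recommendation):

```json
"refreshPolicy": {
    "policyType": "basic",
    "rollingWindowGranularity": "year",
    "rollingWindowPeriods": 5,
    "incrementalGranularity": "day",
    "incrementalPeriods": 7
}
```

With a policy like this, only the incremental window is reloaded on each refresh, which keeps the transient memory footprint far smaller than a full reload of the whole fact table.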





If you found this helpful, consider giving some Kudos.
If I answered your question or solved your problem, mark this post as the solution!

Proud to be a Super User!





Kevin8
Frequent Visitor

Hi Tayloramy,

It's a Fabric F64 workspace. Are there any other specs that you need?

cengizhanarslan
Super User

Do you have calculated columns or Power Query transformation steps that disable the native query option? If so, even if the refreshed dataset's size is lower than the max limit, the size of the model could exceed the limit during the refresh process. Consider making all of those transformations at the source level, which is AWS Redshift in your case.

_________________________________________________________
If this helped, ✓ Mark as Solution | Kudos appreciated
Connect on LinkedIn | Follow on Medium
AI-assisted tools are used solely for wording support. All conclusions are independently reviewed.

Hi Cengizhanarslan,

I have shrunk the report as much as possible and folded the transformations back to the source, so that they are done in the SQL query and the Power Query steps are reduced as far as possible.

However, before I did this the error message was the same: the report was a fraction too big and needed to be made slightly smaller to work.

After massively reducing the report size, the same error message occurred. This leads me to conclude it is not a Power BI memory issue, since the same error message is given regardless of the report size.

Any ideas?

 

Hi @Kevin8

 

It's not the file size of the .pbix that is the problem here; it's how much the data expands during refresh.

For an F64 capacity, the max memory available per semantic model is 25 GB, which is the limit you're hitting.

 

Can you describe more about what data sources you're using and what sort of transformations are being done?










Hi,
There are 29 queries. It's a star schema with one large fact table and many small dimension tables. The dimension tables might have some Power Query steps, such as columns removed or renamed, or a key column added. Several of these have 5-10 rows. The largest has 9 columns and 750k rows, the next largest has 4 columns and 500k rows, and the next has 5 columns and 26k rows.

The main fact table has 88 columns and 11.5M rows. It has a renamed-columns step and an append query that adds an 88-row table.
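Some back-of-the-envelope arithmetic (assuming a very rough 8 bytes per value before VertiPaq compression, which is an illustration, not a measured figure) shows why a table this shape can blow past the 25 GB cap during refresh, when the engine may hold the old model, the new model, and processing buffers at once:

```python
# Rough, illustrative sizing of the fact table described above:
# 88 columns x 11.5M rows, assuming ~8 bytes per value uncompressed.
# Real memory use depends heavily on data types and VertiPaq compression.
rows = 11_500_000
cols = 88
bytes_per_value = 8  # assumption, not a measured figure

uncompressed_gb = rows * cols * bytes_per_value / 1e9
print(f"~{uncompressed_gb:.1f} GB uncompressed")  # ~8.1 GB for one copy

# During a full refresh the engine can transiently need several copies
# (old model + new model + uncompressed processing buffers), so the
# peak footprint can be a multiple of this single-copy estimate.
```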






tayloramy
Super User

Hi @Kevin8

 

According to the error, your dataset is expanding to over 25 GB during refresh.

 

What type of workspace do you have? Is it a Pro workspace, or a Premium workspace attached to a capacity? If it is attached to a capacity, which capacity SKU do you have?










Hi Tayloramy,

The capacity SKU is F64. The workspace also says:

Semantic model storage format: Small semantic model storage format.
