
Kevin8
Frequent Visitor

Problems with importing data from AWS Redshift

Hi everyone,
 

I have a Power BI report stored in a Power BI workspace within my work organisation. It is set to refresh weekly on Monday mornings.

The refresh fails, and the reason given for the failure is that the dataset is too large and takes up too much memory:

Data source error: Resource Governance: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 25518 MB, memory limit 25514 MB, database size before command execution 85 MB. See Troubleshoot XMLA endpoint connectivity in Power BI - Microsoft Fabric to learn more.

However, a copy of this report does refresh successfully from another workspace at the same time. Both workspaces have the same license configuration and capacity.

Further, there is a near-identical report that brings in a larger dataset and continues to refresh without any issues. This report sources its data from a data warehouse internal to the organisation.

Also, there is another report in the same workspace that is larger in size and brings in a larger dataset from AWS Redshift without any refresh issues.

For reference, the report that fails is listed as 495 MB in size, although the actual file size is 265,100 KB.
The second report in the same workspace, which gets its data internally, is 748 MB.
The third report in the same workspace, which gets its data from Redshift without issue, is 501 MB.

1 ACCEPTED SOLUTION
tayloramy
Super User

Hi @Kevin8

 

Switching to large semantic model storage might help.

Is your 11M-row fact table a full load on each refresh, or is it incremental? If it is a full load, you may want to look at making it incremental.
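For reference, a minimal sketch of what an incremental-refresh filter can look like in Power Query (M). The server, database, table, and `LoadDate` column names here are hypothetical; `RangeStart` and `RangeEnd` are the reserved DateTime parameters that Power BI's incremental refresh policy fills in at refresh time:

```m
let
    // Connect to Redshift (server and database names are placeholders)
    Source = AmazonRedshift.Database("my-cluster.example.com:5439", "analytics"),
    Fact = Source{[Schema = "public", Item = "fact_sales"]}[Data],
    // Filter on a date column so each refresh only reloads recent partitions.
    // RangeStart/RangeEnd must be defined as DateTime parameters in the model.
    Filtered = Table.SelectRows(
        Fact,
        each [LoadDate] >= RangeStart and [LoadDate] < RangeEnd
    )
in
    Filtered
```

With a policy along the lines of "archive 5 years, refresh the last 7 days", only the recent partitions are reprocessed on each scheduled refresh, which keeps peak refresh memory well below what a full reload of an 11.5M-row table needs.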





If you found this helpful, consider giving some Kudos.
If I answered your question or solved your problem, mark this post as the solution!

Join the Fabric Discord!

Proud to be a Super User!






12 REPLIES
Kevin8
Frequent Visitor

Unfortunately, changing the workspace setting to large semantic model storage didn't fix the issue, and the report continues to fail when refreshing in the service, with the error message citing not enough memory.

I will try another round of reducing the report's size.

Hello @Kevin8,
Thank you for the update.
Reducing the report size is a good approach, and hopefully that helps resolve your issue.

If you continue to experience the issue after making those changes, please let us know and we'll be happy to assist further. Your update will be valuable to the community and may assist others with similar concerns.

Thank you for being part of the Microsoft Fabric Community.

 

v-ssriganesh
Community Support

Hello @Kevin8,

We hope you're doing well. Could you please confirm whether your issue has been resolved or if you're still facing challenges? Your update will be valuable to the community and may assist others with similar concerns.

Thank you.

v-ssriganesh
Community Support

Hi @Kevin8,

Thank you for posting your query in the Microsoft Fabric Community Forum, and thanks to @tayloramy & @cengizhanarslan for sharing valuable insights.

 

Could you please confirm if your query has been resolved by the provided solutions? This would be helpful for other members who may encounter similar issues.

 

Thank you for being part of the Microsoft Fabric Community.


Kevin8
Frequent Visitor

Hi Tayloramy,

It's a Fabric F64 workspace. Are there any other specs that you need?

cengizhanarslan
Super User

Do you have calculated columns or Power Query transformation steps that disable the native query option? If so, even if the refreshed dataset's size is lower than the max limit, the size of the model could exceed the limit during the refresh process. Consider performing all of those transformations at the source level, which is AWS Redshift in your case.
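To illustrate (a hypothetical sketch with placeholder server, table, and column names, and assuming the Redshift connector supports `Value.NativeQuery`): a step like the commented-out `Table.AddColumn` below can prevent the query from folding back to the source, forcing the mashup engine to materialise the table in memory, whereas pushing the same logic into the source SQL keeps the work in Redshift:

```m
let
    Source = AmazonRedshift.Database("my-cluster.example.com:5439", "analytics"),

    // A Power Query step like this may break query folding:
    //   WithMargin = Table.AddColumn(Fact, "Margin", each [Revenue] - [Cost]),

    // Doing the same work in the source query keeps it on the warehouse side:
    Fact = Value.NativeQuery(
        Source,
        "SELECT *, revenue - cost AS margin FROM public.fact_sales",
        null,
        [EnableFolding = true]
    )
in
    Fact
```

The same idea applies to calculated columns defined in DAX: if they can be computed as ordinary columns in the source SQL, the model no longer has to recompute them during refresh.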

_________________________________________________________
If this helped, ✓ Mark as Solution | Kudos appreciated
Connect on LinkedIn | Follow on Medium
AI-assisted tools are used solely for wording support. All conclusions are independently reviewed.

Hi Cengizhanarslan,

I have shrunk the report as much as possible and folded the transformations back to the source, so that they are done in the SQL query and the Power Query steps are reduced as much as possible.

However, before I did this the error message was the same: the report was a fraction too big and needed to be made slightly smaller to work.

After massively reducing the report size, the same error message occurred. This leads me to conclude it is not simply a Power BI memory issue: the same error message is given regardless of the report size.

Any ideas?

 

Hi @Kevin8

 

It's not the file size of the pbix file that is the problem here; it's how much the data expands during refresh.

For an F64 capacity, the max memory available is 25GB, which is the limit you're hitting. 

 

Can you describe more about what data sources you're using and what sort of transformations are being done?










Hi,
There are 29 queries. It's a star schema with one large fact table and many small dimension tables. The dimension tables might have some Power Query steps such as columns removed or renamed, or a key column added. Several of these have 5-10 rows. The largest has 9 columns and 750k rows, the next largest has 4 columns and 500k rows, and the next has 5 columns and 26k rows.

The main fact table has 88 columns and 11.5M rows. It has a renamed-columns step and an append query where it adds an 88-row table.






tayloramy
Super User

Hi @Kevin8

 

According to the error, your dataset is expanding to over 25 GB during refresh.

 

What type of workspace do you have? Is it a Pro workspace, or a Premium workspace attached to a capacity? If it is attached to a capacity, which capacity SKU do you have?










Hi Tayloramy,

The capacity SKU is F64. The workspace also says:

Semantic model storage format: Small semantic model storage format.
