whiteBirdie_258
Regular Visitor

Data refresh failure connecting to Google BigQuery

My data source is Google BigQuery, and I have a scheduled automatic weekly refresh, but it has been failing consistently for the past few days. The error message says: [Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 16155 MB, memory limit 15405 MB, database size before command execution 978 MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more.]

 

From reading other similar posts, I know this happens because during a refresh Power BI holds the old compressed copy of the dataset, the new compressed copy, and the uncompressed data being processed all in memory at the same time, so peak memory usage is significantly higher than the compressed size of the model.

 

I have a Pro license and am not planning to pay for Premium (no budget), so I am stuck with the 1 GB memory limit for refresh; I also can't turn on the "Large dataset storage format" setting, nor can I use the Capacity Metrics app to inspect my data refresh process.

 

While stuck on Pro, I found online tutorials that visualize the refresh process using SQL Server Profiler (via SSMS) to find memory spikes, so you can optimize the data model, calculated columns, and so on. However, I cannot connect Profiler to the Analysis Services endpoint behind my dashboard, and I suspect it is because the data source is Google BigQuery.

 

My question is: are there other ways to resolve this data refresh failure without upgrading my Power BI license?

3 REPLIES
ReportGuru
Post Patron

Hi @whiteBirdie_258, I understand the frustration with the memory limitations of the Power BI Pro license and the challenge of refreshing large datasets from Google BigQuery. Here are a few suggestions that might help you resolve this issue without needing to upgrade to a Premium license:

Possible Solutions

  1. Data Reduction:

    • Consider reducing the data being imported by applying filters or aggregating the data at the source; this can significantly decrease the memory footprint during the refresh process (see the sketch after this list).
  2. Incremental Refresh:

    • If possible, set up incremental refresh policies to only import new or changed data rather than refreshing the entire dataset.
  3. Data Transformation:

    • Use Power Query to preprocess and clean your data before loading it into the Power BI model. This can help in reducing the overall size of the dataset.
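
For illustration, here is a minimal Power Query (M) sketch of points 1 and 3 combined. All names here (billing project, project, dataset, table, and columns) are placeholders, and the navigation steps can differ slightly between connector versions:

    let
        // Connect to BigQuery; "my-billing-project" etc. are placeholders
        Source = GoogleBigQuery.Database([BillingProject = "my-billing-project"]),
        Project = Source{[Name = "my-project"]}[Data],
        Dataset = Project{[Name = "analytics", Kind = "Schema"]}[Data],
        Sales = Dataset{[Name = "sales", Kind = "Table"]}[Data],
        // Filter early so the step folds back to BigQuery as a WHERE clause
        Recent = Table.SelectRows(Sales, each [order_date] >= #date(2024, 1, 1)),
        // Keep only the columns the report actually uses
        Trimmed = Table.SelectColumns(Recent, {"order_date", "customer_id", "amount"})
    in
        Trimmed

As long as these steps fold, the filtering and column pruning happen inside BigQuery, and only the reduced result has to be transferred, compressed, and held in memory during the refresh.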

Alternative Solution

If you are looking for a more streamlined and efficient way to manage your data between Google BigQuery and Power BI, you might consider using third-party data integration tools. For instance, Windsor.ai offers a robust solution for connecting Google BigQuery with Power BI. It allows you to preprocess and optimize your data before importing it into Power BI, which can help in avoiding memory limit issues. Using such solutions can help you maintain efficient data refreshes without the need for a Premium license, saving both time and resources.

Hope this helps! 

ReportRanger
Helper III

Hi @whiteBirdie_258, it sounds like you're facing a common challenge with data refreshes in Power BI, especially under the memory limits of a Pro license. Here are some strategies that might help you mitigate this issue without needing to upgrade to a Premium license:

Optimize Your Data Model

  1. Reduce Data Volume:

    • Query Folding: Ensure that your queries fold back to BigQuery so the transformations happen on the server side. This reduces the amount of data transferred and processed in Power BI (see the folding sketch after this list).
    • Filtering: Apply more aggressive filtering in your data source queries to reduce the dataset size.
    • Aggregations: Use aggregations to reduce the detail level of your data.

  2. Optimize Columns:

    • Remove unnecessary columns and only load the data that you need for your reports.
    • Use data types that are more memory-efficient (e.g., integers instead of strings where possible).

  3. Calculated Columns and Measures:

    • Move calculated columns and complex transformations to BigQuery where possible. This can reduce the memory footprint in Power BI.
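
As a rough sketch of the folding and aggregation points above (table and column names are made up), the following keeps every step foldable; in the Power Query editor you can right-click a step and choose "View Native Query" to confirm the work is being pushed down to BigQuery:

    let
        Source = GoogleBigQuery.Database([BillingProject = "my-billing-project"]),
        Project = Source{[Name = "my-project"]}[Data],
        Dataset = Project{[Name = "analytics", Kind = "Schema"]}[Data],
        Orders = Dataset{[Name = "orders", Kind = "Table"]}[Data],
        // Grouping folds to a GROUP BY in BigQuery, so only summary rows are imported
        Daily = Table.Group(
            Orders,
            {"order_date", "region"},
            {{"total_amount", each List.Sum([amount]), type nullable number},
             {"order_count", each Table.RowCount(_), Int64.Type}})
    in
        Daily

If "View Native Query" is greyed out on the final step, something above it broke folding; reordering the steps so that filters and aggregations come first usually restores it.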

Alternative Solutions

  • Incremental Refresh: If your dataset supports it, set up incremental refresh so that only new or changed data is processed rather than the entire dataset (a sketch follows this list).
  • Split the Dataset: If feasible, split your dataset into smaller chunks and process them separately.
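
On the incremental refresh point: it does work on a Pro license, not just Premium. Here is a minimal sketch of the filter it requires, assuming a made-up timestamp column order_ts and the same placeholder source as above (RangeStart and RangeEnd must be defined as datetime parameters in Power Query):

    let
        Source = GoogleBigQuery.Database([BillingProject = "my-billing-project"]),
        Project = Source{[Name = "my-project"]}[Data],
        Dataset = Project{[Name = "analytics", Kind = "Schema"]}[Data],
        Orders = Dataset{[Name = "orders", Kind = "Table"]}[Data],
        // Use >= on one boundary and < on the other so rows are never loaded twice.
        // If BigQuery returns the column as datetimezone, convert it (or the
        // parameters) to a common type before comparing.
        Filtered = Table.SelectRows(Orders,
            each [order_ts] >= RangeStart and [order_ts] < RangeEnd)
    in
        Filtered

After you define the refresh policy on the table in Power BI Desktop (for example, archive two years, refresh only the last seven days), each scheduled refresh processes just the recent partitions, which keeps peak memory far below a full reload.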

Explore Other Tools

Since you are not planning to upgrade to a Premium license, you might also consider exploring other data integration tools that can handle large datasets more efficiently. For instance, Windsor.ai offers data connectors that might help streamline your data processes and reduce memory consumption when exporting your data into Power BI.

I hope these suggestions help!

Anonymous
Not applicable

Hi @whiteBirdie_258 ,

 

Migrating the data or changing the data source doesn't seem like a particularly good option. Instead, you could connect via DirectQuery or a Live Connection, or convert all of your calculated columns into measures.

 

Hope it helps!

 

Best regards,
Community Support Team_ Scott Chang

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
