Hi,
I'm using Power BI over a Redshift database.
I created a report (.pbix file) in Desktop and changed the connection mode from DirectQuery to Import,
then published it and updated the credentials for the Redshift database (cloud).
Data: ~40M rows
.pbix file size: ~110 MB
It works fine on the web too, until I try to refresh it there; then it fails with this error.
Any idea how to handle it?
Thanks,
Ziv
Hi @ziv,
Is Redshift an on-premises database hosted on your machine? Did you add this data source to a gateway to enable scheduled refresh?
What is the result if you refresh the dataset (Import mode) manually in Desktop? If you connect to Redshift with DirectQuery, does the issue persist?
From the error prompt above, it looks like the heavy data load caused the Power Query execution to fail. Please check whether this similar thread is helpful for your scenario.
Regards,
Yuliana Gu
Hi Yuliana,
1. Redshift is a cloud columnar database service by Amazon; as far as I know, no gateway is needed for cloud sources.
2. Refreshing manually in Desktop, the data is updated and I can work with it as needed.
With DirectQuery there is no issue on Desktop or on the web.
3. I don't see a solution in the attached link; also, I'm already loading aggregated data with only the required fields.
In addition, refreshing the imported data on the web with only 100 rows works with no errors (a sketch of such a limited test query follows below).
Thanks,
Ziv
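For reference, one way to reproduce that 100-row test is to point the Import query at a limited subset of the source table before refreshing in the service. This is only a sketch; the schema, table, and column names below are hypothetical and not from the original thread.

-- Hypothetical test query: import only a small slice to confirm the refresh itself succeeds
SELECT order_id, order_date, region, sales_amount
FROM analytics.fact_orders
LIMIT 100;

If the limited refresh succeeds while the full one fails, that points to data volume rather than credentials or connectivity as the cause.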
Did you find a solution? I am facing the same problem with auto refresh. Also, when I refresh in Desktop I get the same error. However, if I refresh each table one by one in Desktop it works fine.
Thanks Gilbert, but unfortunately that's not the problem. I do have "Version: 2.56.5023.1021 64-bit (March 2018)" installed.
Can someone please help me 😞
PBI refreshes fine when I manually refresh each table but throws an out-of-memory error on a "bulk" refresh. I even disabled "Parallel loading of tables" in Options.
Hi GilbertQ, I have 64-bit installed, not 32-bit.
On top is the process: "CefSharp.BrowserSubprocess".
I have quite a few visuals on the page and most of them are custom. (Page 01 custom visuals: Calendar visual, cards with states, chiclet slicers, quadrant chart. Page 02 custom visuals: shuffle stack, choropleth map, chiclet slicers, and Hierarchy slicer.)
No. I removed all custom visuals and a few default visuals, leaving the page with just 3-4 default visuals. I still get the same error. When I refresh the tables individually it always works fine.
Hi,
I created an aggregated table with fewer fields, only those required for analysis, without exceeding the row limit (see the SQL sketch below).
I made two tables/views for separate datasets, covering what was mainly required, to get a fast response.
Works great for my needs.
Good luck,
Ziv
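As an illustration of the approach Ziv describes, here is a minimal Redshift SQL sketch of an aggregated view that exposes only the fields needed for analysis. All schema, table, and column names are hypothetical examples, not taken from the thread.

-- Hypothetical aggregated view: pre-summarize the fact table so Power BI imports far fewer rows
CREATE VIEW analytics.v_sales_agg AS
SELECT
    order_date,
    region,
    product_category,
    SUM(sales_amount) AS total_sales,
    COUNT(*)          AS order_count
FROM analytics.fact_orders
GROUP BY order_date, region, product_category;

In Power BI Desktop, the Import-mode connection would then point at this view (and at a second, similar view for the other dataset) instead of the raw ~40M-row table, keeping the refresh well under the memory and row limits.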