I'm trying to refresh a table in a Power BI dataset through the XMLA endpoint in SSMS, and I'm getting this error: "Stream does not support reading. The remote server returned an error: (400) Bad Request." The screenshot shows the exact error message I see.
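For context, what I'm executing in the SSMS XMLA query window is roughly a TMSL refresh command like the one below (the dataset and table names here are just placeholders):

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyDataset",
        "table": "MyTable"
      }
    ]
  }
}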
A few points:
- I can refresh this exact same table with a smaller amount of data, but with the full data it gives me the error.
- I'm using a native SQL query for the table.
Has anybody experienced this? What could be the cause?
Thanks in advance!
Is your table hitting the 10GB partition limit? Or maybe the capacity admin has specified a much smaller dataset limit?
The thing is, I'm able to refresh the data in Power BI Desktop. The table itself only contains 10 million rows and 10 columns, but the native query is somewhat complex and long. Maybe the query is generating big materializations at intermediate steps that go over a certain limit...?
That is certainly possible. Maybe you can change the timeout of the native query.
Yeah, I have set it to 3 hours and it runs fine in Power BI Desktop. It's a shame SSMS doesn't give a useful message in my case.
Which refresh type did you specify in the XMLA command? And did you target an individual partition?
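For example, a full refresh scoped to a single partition would be a TMSL command roughly like this (the database, table, and partition names are placeholders; "dataOnly" would be another type to try):

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyDataset",
        "table": "MyTable",
        "partition": "MyTable-Partition1"
      }
    ]
  }
}

If the whole table fails but a single partition succeeds, that would point towards a size or memory limit rather than the query itself.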
I've tried both. They give me the same error though 😅
You can raise an issue at https://community.fabric.microsoft.com/t5/Issues/idb-p/Issues . If you have a Pro license you can consider raising a Pro ticket at https://admin.powerplatform.microsoft.com/newsupportticket/powerbi
How long before it errors out?
It's between 12 and 17 minutes. Pretty consistent.
Is this dataset on a premium capacity? If so, what SKU? Maybe the table is too large for the SKU?
It's on a Premium capacity, and it should be P3.