I have a dashboard that shows data from my Azure Storage Table. It was working correctly before June 13, but recently I get this error when the data in my semantic model is auto-refreshed:
DataFormat.Error: OData: The format 'application/json;odata=fullmetadata;streaming=true;charset=utf-8' is not supported. application/json;odata=fullmetadata;streaming=true;charset=utf-8. The exception was raised by the IDataReader interface. Please review the error message and provider documentation for further information and corrective action.
When I refresh the data manually in the Desktop app, it refreshes correctly. Any insight into why it fails only when refreshing (automatically or manually) the semantic model in the Service?
Hi @XichengWang , Thank you for reaching out to the Microsoft Community Forum.
Your issue occurs because the Power BI Service is sending OData requests with a content type that Azure Table Storage does not support. This behaviour seems to have changed after June 13, likely due to an update in how Power BI Service handles OData feeds. Power BI Desktop still works because it uses a different engine that doesn’t enforce the same request headers.
To fix this, you can try modifying your OData query in Power BI Desktop to explicitly request a simpler metadata format. In the Advanced Editor, use:
OData.Feed(
    "https://<youraccount>.table.core.windows.net/",
    null,
    [
        ODataVersion = 3,
        Headers = [
            Accept = "application/json;odata=nometadata"
        ]
    ]
)
However, this workaround may not work in the Power BI Service, because the Service often overrides custom headers during refresh. If it doesn't resolve the issue, you'll need to bypass the OData feed entirely. The most stable approach is to create an Azure Function (or other REST API proxy) that reads from your Azure Table Storage and returns plain JSON, then connect to it using Web.Contents() in Power BI.
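Once such an endpoint exists, the query in Power BI could look roughly like this (the function URL below is only a placeholder for whatever API you create, not an existing endpoint):

let
    // Hypothetical Azure Function / REST proxy that returns a plain JSON array of rows
    Source = Web.Contents("https://your-function-app.azurewebsites.net/api/GetTableData"),
    // Parse the JSON response body into a list of records
    Rows = Json.Document(Source),
    // Convert the list of records into a table
    Result = Table.FromRecords(Rows)
in
    Result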
If you want a no-code workaround, you can use Power Automate to periodically pull data from Azure Table Storage via HTTP and push it into a SharePoint list or Dataverse table, then connect your semantic model to that. Alternatively, if you're using Microsoft Fabric, create a Dataflow Gen2 that pulls data from the REST API and lands it into a Lakehouse or Warehouse.
If this helped solve the issue, please consider marking it “Accept as Solution” so others with similar queries can find it more easily. If not, please share the details; I'm always happy to help.
Thank you.
Thanks for the reply @v-hashadapu !
I'm not quite sure the OData.Feed fix you mentioned above works in my scenario. I tried OData.Feed before, but I currently access the Azure Table with an AccountKey, which doesn't seem to be supported by OData.Feed. I also tried using the same key under the Web API option when providing credentials for the OData feed, but that didn't work. Any idea whether I can use that AccountKey to access the Azure Table with the OData.Feed method?
Thank you!
Hi @XichengWang , Thank you for reaching out to the Microsoft Community Forum.
Power BI Service has started enforcing stricter Accept headers on OData requests, such as: application/json;odata=fullmetadata;streaming=true;charset=utf-8. Azure Table Storage doesn’t support this format and Power BI Service doesn’t allow overriding these headers, even if you configure OData.Feed() with custom settings in Power Query. That’s why your model still refreshes in Power BI Desktop but fails in the Service, including during manual refreshes.
This issue can’t be resolved by using OData.Feed() with an AccountKey. The connector is no longer reliable for Azure Table Storage in the Power BI Service, due to both unsupported metadata formats and lack of support for AccountKey-based authentication. Power BI Service strips custom headers and only supports OAuth or SAS-based access for OData sources.
The only stable approach is to stop using OData entirely and connect to Azure Table Storage through its REST API using Web.Contents(). This lets you control headers directly and avoids the metadata negotiation that causes refresh failures. If you’re using a SAS token, the connection can be made without needing a gateway, since the token handles authentication through the URL itself.
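As a rough sketch of that pattern (the table name and SAS token below are placeholders; swap in your own values):

let
    // Placeholder table URL and SAS token - replace with your own values
    Source = Web.Contents(
        "https://<youraccount>.table.core.windows.net/<yourtable>?<your-sas-token>",
        [Headers = [Accept = "application/json;odata=nometadata"]]
    ),
    // The Table service returns a JSON object whose "value" field holds the entities
    Body = Json.Document(Source),
    Entities = Table.FromRecords(Body[value])
in
    Entities

With the SAS token in the URL, the data source credentials in the Service can typically be set to Anonymous.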
If you need to use an AccountKey or require custom headers, the recommended solution is to create an Azure Function or small API that reads from Azure Table Storage and returns clean JSON. This avoids all OData and header constraints and refreshes reliably in Power BI Service without requiring a gateway.
If this helped solve the issue, please consider marking it “Accept as Solution” so others with similar queries can find it more easily. If not, please share the details; I'm always happy to help.
Thank you.