For our Power BI connector, we have very large responses that we need to stream part-by-part as an HTTP multipart response; this would greatly improve performance, as the server could send each chunk of data as soon as it becomes available.
The response, conforming to HTTP multipart/mixed spec, looks like this in its raw form:
---
content-type: application/json; charset=utf-8

{"hasNext":true,"data":{"books":[]}}
---
content-type: application/json; charset=utf-8

{"hasNext":true,"incremental":[{"path":["books",0],"items":[{"title":"Things Fall Apart","author":"Chinua Achebe"}]}]}
The server streams multiple parts over time with separators and content-type info between each part.
If you use Web.Contents, it just returns the whole string displayed above instead of interpreting the multipart response correctly. We could do text processing on that string, but that defeats the point. Is there a way to handle multipart responses in Power Query so that each part can be processed in a streaming manner?
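For completeness, the buffered text-processing workaround mentioned above might look roughly like this in M (the URL is hypothetical, and this only runs once the entire response has arrived, so nothing is streamed):

```m
let
    // Hypothetical endpoint; Web.Contents buffers the whole multipart body
    raw = Text.FromBinary(Web.Contents("https://example.com/graphql-stream")),
    // Split on the part separator and drop empty fragments
    rawParts = List.Select(Text.Split(raw, "---"), each Text.Trim(_) <> ""),
    // Skip each part's content-type header by jumping to the first "{",
    // then parse the JSON body
    parts = List.Transform(
        rawParts,
        each Json.Document(Text.Range(_, Text.PositionOf(_, "{")))
    )
in
    parts
```

This yields a list of parsed records, but only after the final part has been received, which is exactly the limitation we are trying to avoid.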
What is the HTTP status code of the original response? Remember you have this option:
Handling status codes with Web.Contents for Power Query connectors - Power Query | Microsoft Learn
The Web.Contents function has some built-in functionality for dealing with certain HTTP status codes. The default behavior can be overridden in your extension using the ManualStatusHandling field in the options record.
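A minimal sketch of that option (the URL is hypothetical; ManualStatusHandling and Value.Metadata are as documented in that article):

```m
let
    // Hypothetical URL; listing status codes in ManualStatusHandling stops
    // Web.Contents from raising an error for those codes automatically
    response = Web.Contents(
        "https://example.com/api",
        [ManualStatusHandling = {404, 500}]
    ),
    // The actual status code is exposed in the response metadata
    status = Value.Metadata(response)[Response.Status]
in
    if status = 200 then Json.Document(response)
    else error Error.Record("HttpError", "Request failed", Text.From(status))
```

Note this handles the status code of the response as a whole, not of individual parts within it.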
Hi - Thanks for the reply. I looked into this, and for this multipart/mixed response there is just a single HTTP 200 status code at the start of the response; further parts arrive after the separator without any additional status code.
So I think manual status code handling would not work with the multipart/mixed response format. It might help if partial responses arrived as separate HTTP 206 Partial Content responses, and that may be worth researching, but that would be a different protocol from the one we're currently considering.