Hello,
I am working on getting a Dataflow to pull data from a Web API via an On-premises Gateway. The Web API returns ~2M rows of data, ~1.5 GB in size, in ~3 minutes. The Web API applies a significant amount of business logic and security settings that must be applied when extracting the data. While the data is "tall" and requires the use of an HTTP content stream, I would not consider it all that large for the x64 Gateway process to consume. When I refresh the Dataflow, the Gateway fails and the following stack trace appears in the "GatewayErrors20200224.00000.log" file:
[The full stack trace exceeds the 20,000 character limit]
GatewayPipelineErrorCode=DM_GWPipeline_UnknownError
GatewayVersion=
InnerType=ErrorException
InnerMessage=<pi>Evaluation ran out of memory and can't continue.</pi>
InnerToString=<pi>Microsoft.Mashup.Evaluator.Interface.ErrorException: Evaluation ran out of memory and can't continue. ---> Microsoft.Mashup.Evaluator.Interface.ErrorException: Evaluation ran out of memory and can't continue. ---> Microsoft.Mashup.Evaluator.Interface.ErrorException: Evaluation ran out of memory and can't continue. ---> Microsoft.Mashup.Evaluator.Interface.ErrorException: Array dimensions exceeded supported range. ---> System.OutOfMemoryException: Array dimensions exceeded supported range. ---> System.OutOfMemoryException: Array dimensions exceeded supported range.
at Microsoft.Mashup.Engine1.Library.Lines.TextReaderLineReader.MoveNext()
at Microsoft.Mashup.Engine1.Runtime.StreamedListValue.GetReference(Int32 index)
at Microsoft.Mashup.Engine1.Runtime.StreamedListValue.get_Item(Int32 index)
at Microsoft.Mashup.Engine1.Language.InstructionInvocationInstruction2.Execute(MembersFrame1& frame)
at Microsoft.Mashup.Engine1.Language.MembersFunctionValue1.Invoke(Value arg0)
... 100 more lines of stack trace below here...
The C# logic in the ASP.NET (.NET 4.7.2) Web API that "writes" the serialized JSON to the Content stream is as follows:
protected HttpResponseMessage ProcessStreamResponse<TResult>( Func<TResult> method )
{
var response = Request.CreateResponse();
response.Content = new PushStreamContent(
( outputStream, httpContent, transportContext ) =>
{
var serializer = new JsonSerializer();
using( var writer = new StreamWriter( outputStream ) )
{
serializer.Serialize( writer, method() );
}
}, new MediaTypeHeaderValue( "applicaton/json" ) );
return response;
}
The Power Query logic consuming the API contents is as follows (AccessToken retrieval has been truncated for clarity):
DataRequestHeaders = ( _accessToken as text ) as record =>
[
#"Authorization" = "Bearer " & _accessToken,
#"Content-Type" = "application/json; charset=utf-8"
],
DataRequestParameters = [
RelativePath = "customers",
Headers = DataRequestHeaders( AccessToken ),
Timeout = #duration(0,1,0,0),
ExcludedFromCacheKey={"Authorization"},
ManualStatusHandling = { 500 },
IsRetry = true
],
WebResult = Web.Contents( "https://myUrl/", DataRequestParameters ),
UnpackedStream = Lines.FromBinary( WebResult ) {0},
DataRequestResult = Json.Document(UnpackedStream),
RecordTable = Table.FromList(DataRequestResult, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
ResultTable = Table.ExpandRecordColumn( RecordTable, "Column1",{ "ID","Customer" /* 20 additional columns */ })
I am able to pull the exact same data into the Dataflow using a SQL connection, so it seems the Web.Contents call is introducing some sort of array-length limitation. The stack trace shows a StreamedListValue being indexed with an Int32, which may be the problem. Is there a different/better way to deal with tall (2M+ row) tables coming from a Web API? Any ideas or information would be greatly appreciated. Thanks!
Jason
It turns out this was a case of bad Power Query logic. I removed:
UnpackedStream = Lines.FromBinary( WebResult ) {0},
so that my Power Query logic is:
DataRequestHeaders = ( _accessToken as text ) as record =>
[
#"Authorization" = "Bearer " & _accessToken,
#"Content-Type" = "application/json; charset=utf-8"
],
DataRequestParameters = [
RelativePath = "customers",
Headers = DataRequestHeaders( AccessToken ),
Timeout = #duration(0,1,0,0),
ExcludedFromCacheKey={"Authorization"},
ManualStatusHandling = { 500 },
IsRetry = true
],
WebResult = Web.Contents( "https://myUrl/", DataRequestParameters ),
DataRequestResult = Json.Document(WebResult),
RecordTable = Table.FromList(DataRequestResult, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
ResultTable = Table.ExpandRecordColumn( RecordTable, "Column1",{ "ID","Customer" /* 20 additional columns */ })
and I am no longer encountering any issues. Thanks,
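For anyone hitting the same error, my read on the root cause (an inference from the stack trace, not officially documented): Lines.FromBinary materializes each line of the response as a single text value, and a ~1.5 GB JSON payload serialized with no line breaks is one enormous "line" that exceeds .NET's maximum array length, hence the "Array dimensions exceeded supported range" OutOfMemoryException. Passing the binary straight to Json.Document lets the mashup engine consume the stream without buffering it as one giant string. A minimal sketch of the two patterns side by side:

```
// Problematic: buffers the entire single-line response as one text value
// before parsing, so a ~1.5 GB payload blows past .NET's array size limit.
UnpackedStream = Lines.FromBinary( WebResult ){0},
DataRequestResult = Json.Document( UnpackedStream ),

// Fixed: Json.Document consumes the binary stream directly.
DataRequestResult = Json.Document( WebResult )
```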