I have a table, [WorkItems] (245,000 records, 90 MB original size), which is part of a PBI report.
It contains 30 API references (pulling data directly from the API: 30 OData feeds, one per project).
The code is written in Power Query (OData).
When I refresh this PBI report, the [WorkItems] table keeps failing with the following message
(the refresh time is over 10 minutes).
I was able to lower the [WorkItems] table size by removing about half of the API calls (OData feeds), keeping 12 of the 30.
The table size dropped to 42 MB and roughly 100,000 records, and the refresh now takes 2 minutes
(3 minutes when I publish it to the PBI Service; the scheduled refresh no longer fails).
But, and this is my problem, I need to keep ALL 30 OData feeds (projects)...
To achieve this:
- Are there proven, reasonably quick ways to reduce this table's size?
- Should I add some sort of parameter(s) to the Data Source settings to speed up the refresh (i.e. load my records in parts, more quickly)? See the sketch right after this list.
- One other possible option is to add a [Date] filter to reduce the table size to [Current Date] minus 24 months. I am allowed to do this, but I am not sure where in the code and how exactly; see the second sketch, after the code below.
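A minimal sketch of that parameter idea, assuming one helper function builds each feed from a project name and a small list drives it (GetProjectWorkItems and the Projects list are illustrative names, not part of the original report; the URL pattern matches the code below):

// Minimal sketch: one helper function per project, driven by a list of project names.
let
    GetProjectWorkItems = (projectName as text) as table =>
        OData.Feed(
            "https://analytics.dev.azure.com/MyCompany/" & projectName & "/_odata/v3.0-preview/WorkItems?" &
            "$select=ParentWorkItemId, StoryPoints, State, WorkItemType, Title, IterationSK, AreaSK, WorkItemId, Area" &
            "&$filter=(WorkItemType eq 'Bug' or WorkItemType eq 'User Story')" &
            "&$expand=Area($select=AreaPath)",
            null,
            [Implementation = "2.0"]
        ),
    // One entry per project instead of 30 hand-written OData.Feed calls
    Projects = {"Research and Development", "Cloud Platform", "Batch Management"},
    Combined = Table.Combine(List.Transform(Projects, GetProjectWorkItems))
in
    Combined

With this pattern, adding or removing a project is a one-line change to the list rather than another hand-written OData.Feed call.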
Here is the code (Power Query, OData; I have included only 3 of the 30 OData feeds). Any suggestions would be very helpful:
// "$select=ParentWorkItemId, StoryPoints, State, WorkItemType, Title, IterationSK, AreaSK, WorkItemId" & "&$filter=(WorkItemType eq 'Bug' or WorkItemType eq 'User Story')", null, [Implementation="2.0"]),
let
// CreatedDate is included in $select so the 24-month filter further down can use it
Source = OData.Feed(
    "https://analytics.dev.azure.com/MyCompany/Research and Development/_odata/v3.0-preview/WorkItems?" &
    "$select=ParentWorkItemId, StoryPoints, State, WorkItemType, Title, IterationSK, AreaSK, WorkItemId, Area, CreatedDate" &
    "&$filter=(WorkItemType eq 'Bug' or WorkItemType eq 'User Story')" &
    "&$expand=Area($select=AreaPath)",
    null,
    [Implementation="2.0"]),
#"Add all Ops & CP projects" = Table.Combine({
    Source,
    OData.Feed(
        "https://analytics.dev.azure.com/MyCompany/Cloud Platform/_odata/v3.0-preview/WorkItems?" &
        "$select=ParentWorkItemId, StoryPoints, State, WorkItemType, Title, IterationSK, AreaSK, WorkItemId, Area, CreatedDate" &
        "&$filter=(WorkItemType eq 'Bug' or WorkItemType eq 'User Story')" &
        "&$expand=Area($select=AreaPath)",
        null,
        [Implementation="2.0"]),
    OData.Feed(
        "https://analytics.dev.azure.com/MyCompany/Batch Management/_odata/v3.0-preview/WorkItems?" &
        "$select=ParentWorkItemId, StoryPoints, State, WorkItemType, Title, IterationSK, AreaSK, WorkItemId, Area, CreatedDate" &
        "&$filter=(WorkItemType eq 'Bug' or WorkItemType eq 'User Story')" &
        "&$expand=Area($select=AreaPath)",
        null,
        [Implementation="2.0"])
}),
#"Add AreaPath" = Table.ExpandRecordColumn(#"Add all Ops & CP projects",
"Area", {"AreaPath"}, {"AreaPath"}),
// Calculate the date 24 months ago from the current date
Date24MonthsAgo = Date.AddMonths(DateTime.LocalNow(), -24),
// Filter data to include only records from the last 24 months
FilteredData = Table.SelectRows(ConvertedIDColumns, each
DateTime.From([CreatedDate]) >= Date24MonthsAgo),
#"Rename Story Points to Effort" = Table.RenameColumns(#"Add AreaPath",{{"StoryPoints", "Effort"}}),
#"Add Organization" = Table.AddColumn(#"Rename Story Points to Effort", "Organization", each "MyCompany"),
#"Change IDs to text" = Table.TransformColumnTypes(#"Add Organization",{{"WorkItemId", type text}, {"ParentWorkItemId", type text}}),
#"Make IDs unique" = Table.TransformColumns(
#"Change IDs to text",
{
{
"WorkItemId",
each
Text.Combine({(_),"-VSTS"}),
type text
}
}
),
#"Make Parent IDs unique" = Table.TransformColumns(
#"Make IDs unique",
{
{
"ParentWorkItemId",
each
Text.Combine({(_),"-VSTS"}),
type text
}
}
),
#"Replaced Value" = Table.ReplaceValue(#"Make Parent IDs unique","-VSTS","",Replacer.ReplaceValue,{"ParentWorkItemId"}),
#"Parent Orphans to ""No Feature""" = Table.ReplaceValue(#"Replaced Value","","No Feature",Replacer.ReplaceValue,{"ParentWorkItemId"})
in
#"Parent Orphans to ""No Feature"""
I think this issue is due to the Azure DevOps API limitation and not because of Power BI. Maybe you can split this into multiple tables and refresh them one after the other, so you will not hit the max limit.
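A minimal sketch of that split, assuming each project becomes its own staging query and one final query only appends them (the query names below, such as WorkItems_RnD, are illustrative):

// Query "WorkItems_RnD" - one project per query, repeated for each of the 30 projects
let
    Source = OData.Feed(
        "https://analytics.dev.azure.com/MyCompany/Research and Development/_odata/v3.0-preview/WorkItems?" &
        "$select=ParentWorkItemId, StoryPoints, State, WorkItemType, Title, IterationSK, AreaSK, WorkItemId, Area" &
        "&$filter=(WorkItemType eq 'Bug' or WorkItemType eq 'User Story')" &
        "&$expand=Area($select=AreaPath)",
        null,
        [Implementation = "2.0"])
in
    Source

// Query "WorkItems" - only appends the per-project staging queries
let
    Source = Table.Combine({WorkItems_RnD, WorkItems_CloudPlatform, WorkItems_BatchManagement})
in
    Source

Each staging query then shows up as its own item in the Queries pane, so a failing project is easier to isolate.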
If the post helps, please give it a thumbs up.
If it solves your issue, please accept it as the solution to help other members find it more quickly.
Tharun