Ref: Get Azure DevOps work item daily historical data u... - Microsoft Fabric Community
About two years ago, someone asked in the community about getting daily granular data from Azure DevOps and was told that it was not possible. But it's been two years, and sometimes things improve or change, so I thought I'd ask the question again.
Microsoft provides an example of how to integrate with Power BI here; however, this does not work within a dataflow on Premium capacity in the Power BI Service. Has anyone had any luck, or should this be posted in a different area of the community?
Thank you
Solved! Go to Solution.
Hi @StephenRabobank - If you need daily granularity for work items, you will likely need to create an external process to extract and store data daily, since Azure DevOps doesn't provide this natively. If the Analytics Views OData feed meets your needs, use it to avoid complex ETL pipelines. If you're using Premium capacity, focus on external staging solutions (e.g., SQL Database or Azure Data Lake) to store snapshots and integrate with Power BI.
Use Power BI Desktop to connect to the data (via REST API or Analytics Views), then publish the dataset to the Power BI Service or Premium capacity.
Or, export and transform the data externally before pushing it to the Dataflow.
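As a rough sketch of the external daily-extract approach described above: a small script scheduled to run once a day could query the Analytics OData endpoint for current work-item state and append a date-stamped snapshot to external storage. The `myorg`/`myproj` names, the PAT, and the selected columns below are placeholders, and the `v4.0-preview` path segment may differ by organization; treat this as an illustration, not a definitive implementation.

```python
# Sketch: daily extract of Azure DevOps work items via the Analytics OData
# endpoint, appended to a local CSV as a date-stamped snapshot.
# ORG, PROJECT, and PAT are placeholders -- substitute your own values.
import base64
import csv
import datetime
import json
import urllib.parse
import urllib.request

ANALYTICS_BASE = "https://analytics.dev.azure.com"

def build_odata_url(org: str, project: str, entity: str, select: str) -> str:
    """Compose an Analytics OData query URL for the given entity and columns."""
    query = urllib.parse.urlencode({"$select": select})
    return f"{ANALYTICS_BASE}/{org}/{project}/_odata/v4.0-preview/{entity}?{query}"

def fetch_work_items(org: str, project: str, pat: str) -> list:
    """Pull current work-item state; a scheduled daily run yields daily grain."""
    url = build_odata_url(org, project, "WorkItems", "WorkItemId,State,StoryPoints")
    # Azure DevOps PATs are sent as Basic auth with an empty username.
    token = base64.b64encode(f":{pat}".encode()).decode()
    req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]

def append_snapshot(rows: list, path: str = "workitem_snapshots.csv") -> None:
    """Stamp each row with today's date and append it to the snapshot file."""
    today = datetime.date.today().isoformat()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for r in rows:
            writer.writerow(
                [today, r.get("WorkItemId"), r.get("State"), r.get("StoryPoints")]
            )
```

The accumulated CSV (or a SQL/Data Lake table written the same way) can then be connected to Power BI or imported into a Dataflow without the Premium-capacity connector limitation.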
Proud to be a Super User!
Thanks. It seems like a gap in Premium Capacity, then.
It also seems like there is the ability to get daily/weekly granularity natively within Azure DevOps using the Analytics View. Since I can connect to the Analytics View in the desktop client (as you pointed out), I would think I should be able to do so in Premium Capacity. They are both Microsoft products, so the data must exist somewhere; I doubt that every time we run the Analytics View, AzDO is recalculating everything.
I do see the "snapshot" tables which mirror that information using OData, however those are restricted to aggregation queries only. So the data is present.
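The aggregation-only snapshot queries mentioned above can be illustrated with an OData `$apply` expression against the `WorkItemSnapshot` entity, which supports grouped aggregates such as daily counts by state. The helper below only builds the query URL; the org/project names, date filter, and column choices are placeholder assumptions.

```python
# Sketch: build an aggregation-only OData query against WorkItemSnapshot,
# returning daily work-item counts grouped by State.
# "myorg"/"myproj" and the date filter are placeholders.
import urllib.parse

def snapshot_count_url(org: str, project: str) -> str:
    """Compose a $apply aggregation query (the only kind snapshots accept)."""
    apply_expr = (
        "filter(DateValue ge 2024-01-01Z)/"
        "groupby((DateValue, State), aggregate($count as Count))"
    )
    query = urllib.parse.urlencode({"$apply": apply_expr})
    return (
        f"https://analytics.dev.azure.com/{org}/{project}"
        f"/_odata/v4.0-preview/WorkItemSnapshot?{query}"
    )
```

A query shaped like this returns one row per day per state, which is often enough for burn-up/burn-down style reporting without any external staging.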
The issue with pulling it into a Power BI dataset is that when merging with dataflows I sometimes get those "reference" errors, and also that once it's in a Power BI semantic model in the Premium space, I have to use that model in its entirety.
But your solution is the same one I've had to come up with, and it's lacking. If it's the best we can do, though, I'll chalk it up to yet another Microsoft misstep with Azure DevOps.