Vinxsta
Frequent Visitor

Help with creating and storing data snapshots from Microsoft Intune

I currently use Power BI to connect to Microsoft Intune via an OData feed. As it stands, the data I see in Power BI is whatever is currently in Intune, and this changes daily. I have two questions: first, what is the best way to create weekly snapshots of this data for trending, and second, what is the best way to store those snapshots (e.g. in blob containers)?

 

I've exhausted all options I'm aware of and I don't know what I don't know. So I'm hoping someone would be able to give me some advice on how best I can do this.

 

Many thanks in advance!!

1 ACCEPTED SOLUTION
burakkaragoz
Community Champion

Hi @Vinxsta ,

 

You're right that the OData feed from Intune only gives you the current state of the data, so for historical tracking you'll need to implement your own snapshot logic.

One common approach is to use a scheduled pipeline (like in Azure Data Factory or a Logic App) that pulls the data from the OData feed on a weekly basis and stores it in a blob container or a data lake. You can store each snapshot as a separate file (e.g. JSON or CSV) with a timestamp in the filename.
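To make the file-per-snapshot idea concrete, here is a minimal Python sketch of the naming and landing step. The dataset name, container path, and upload step are illustrative placeholders, not real Intune or Azure names; a real pipeline would do the upload with Azure Data Factory or the `azure-storage-blob` SDK:

```python
from datetime import date

def snapshot_blob_name(dataset: str, snapshot_date: date, ext: str = "json") -> str:
    """Build a timestamped blob path so each weekly snapshot is a separate file."""
    return f"intune-snapshots/{dataset}/{dataset}_{snapshot_date.isoformat()}.{ext}"

def save_snapshot(payload: bytes, dataset: str, snapshot_date: date) -> str:
    """Placeholder for the landing step: a real pipeline would call
    BlobClient.upload_blob() from the azure-storage-blob package here."""
    name = snapshot_blob_name(dataset, snapshot_date)
    # ... upload `payload` to the storage container under `name` ...
    return name

print(snapshot_blob_name("devices", date(2025, 1, 6)))
# intune-snapshots/devices/devices_2025-01-06.json
```

Keeping the date in the filename means the trend model can later recover the snapshot date without opening the file.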

Then, in Power BI, you can connect to that storage and build a model that reads all the snapshots and lets you do trend analysis over time.
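As a rough sketch of what that "read all snapshots" model does: combine every file into one long table and tag each row with the date parsed from the filename. The two in-memory files and column names below are made up for illustration; in Power BI you would do the equivalent with a folder source and a snapshot-date column:

```python
import csv
import io
import re

# Two simulated weekly snapshot files (in practice these would be blobs
# named e.g. devices_2025-01-06.csv in the storage container).
snapshots = {
    "devices_2025-01-06.csv": "deviceName,complianceState\nPC-1,compliant\nPC-2,noncompliant\n",
    "devices_2025-01-13.csv": "deviceName,complianceState\nPC-1,compliant\nPC-2,compliant\n",
}

def load_snapshots(files: dict) -> list:
    """Read every snapshot and tag each row with the date from its filename,
    producing one long table suitable for trend analysis."""
    rows = []
    for name, content in files.items():
        snapshot_date = re.search(r"(\d{4}-\d{2}-\d{2})", name).group(1)
        for row in csv.DictReader(io.StringIO(content)):
            row["snapshotDate"] = snapshot_date
            rows.append(row)
    return rows

table = load_snapshots(snapshots)

# Example trend query: compliant-device count per week.
compliant_per_week = {}
for row in table:
    if row["complianceState"] == "compliant":
        d = row["snapshotDate"]
        compliant_per_week[d] = compliant_per_week.get(d, 0) + 1

print(compliant_per_week)  # {'2025-01-06': 1, '2025-01-13': 2}
```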

Another option is using a Fabric Dataflow Gen2 to pull the data and land it into a Lakehouse table weekly. That way you can keep everything inside the Fabric ecosystem.

Hope that helps get you started.

If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.

8 Replies
Blue407
Frequent Visitor

What solution did you go for in the end?

 

I have almost exactly the same requirement and I'm trying to figure out the best way to tackle it.

I'm wondering if I should create a new table with a row containing the compliance data values and a date, automate a job to append the current values on the 1st of each month, and then build a report/dashboard from that.

 

I only need to store the totals, not compliance for each device.
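The totals-only idea could look something like this minimal Python sketch: collapse the per-device data into a single dated row before storing it. Field names here are illustrative, not actual Intune column names:

```python
from datetime import date

def compliance_totals(devices: list, run_date: date) -> dict:
    """Collapse per-device compliance into one totals row with a date,
    matching the idea of storing only monthly totals, not every device."""
    compliant = sum(1 for d in devices if d["complianceState"] == "compliant")
    return {
        "date": run_date.isoformat(),
        "compliant": compliant,
        "noncompliant": len(devices) - compliant,
        "total": len(devices),
    }

devices = [
    {"deviceName": "PC-1", "complianceState": "compliant"},
    {"deviceName": "PC-2", "complianceState": "noncompliant"},
    {"deviceName": "PC-3", "complianceState": "compliant"},
]
print(compliance_totals(devices, date(2025, 2, 1)))
# {'date': '2025-02-01', 'compliant': 2, 'noncompliant': 1, 'total': 3}
```

Appending one such row per month keeps the history table tiny while still supporting trend charts.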

Hi @Blue407, I ended up connecting to my OData feed using Microsoft Fabric and setting up a scheduled flow. I append each week's data to one table, with a new column to record the snapshot date. This works fine for me. Hope this helps you.
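For anyone following along, the append-with-snapshot-date pattern amounts to something like the Python sketch below (columns are made up for illustration; in Fabric the same thing happens inside the scheduled dataflow):

```python
from datetime import date

def append_snapshot(history: list, new_rows: list, snapshot_date: date) -> list:
    """Append this week's rows to the running history table, tagging each
    row with the snapshot date (the extra column described above)."""
    for row in new_rows:
        tagged = dict(row)  # copy so the source rows are left untouched
        tagged["snapshotDate"] = snapshot_date.isoformat()
        history.append(tagged)
    return history

history = []
append_snapshot(history, [{"device": "PC-1", "state": "compliant"}], date(2025, 1, 6))
append_snapshot(history, [{"device": "PC-1", "state": "noncompliant"}], date(2025, 1, 13))
print(len(history))  # 2
```

Because every row carries its snapshot date, a single table supports both "current state" (filter to the latest date) and trending (group by date).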

Vinxsta
Frequent Visitor

Apologies for the late response. Thank you for your response, @burakkaragoz. I will look into this further and see which option is better.

No worries at all, glad I could help out! If you have any other questions or need more details while you’re comparing options, just let me know. Good luck with your decision!

v-echaithra
Community Support

Hi @Vinxsta ,

As we haven't heard back from you, we're just following up on our previous message. I'd like to confirm whether you've successfully resolved this issue or if you need further help.

If yes, you are welcome to share your workaround and mark it as a solution so that other users can benefit as well. If you find a reply particularly helpful to you, you can also mark it as a solution.
If you still have any questions or need more support, please feel free to let us know. We are more than happy to continue to help you.

Thank you for your patience; we look forward to hearing from you.
Best Regards,
Chaithra E.

v-echaithra
Community Support

Hi @Vinxsta ,

We wanted to kindly follow up to check whether the solution provided resolved the issue, or let us know if you need any further assistance.
If our response addressed your question, please mark it as the Accepted Solution and click Yes if you found it helpful.

 

Regards,
Chaithra.

v-echaithra
Community Support

Hi @Vinxsta ,

As we haven't heard back from you, we wanted to kindly follow up to check whether the solution provided resolved the issue, or let us know if you need any further assistance.
If our response addressed your question, please mark it as the Accepted Solution and click Yes if you found it helpful.

 

Regards,

Chaithra.

