
mikebrodt
Frequent Visitor

Modeling Data over Time

Hi Everyone,

 

I have a bit of a data model challenge that I would love some recommendations to help resolve. I am reporting off a SQL Table that has a list of files and the name of the server they are on. I need to report on the count of files per server per day. The challenge is that the SQL Table gets wiped and updated daily with that day's report. 

 

I do have a table with a list of all the servers already, and I am able to do the calculations. The question is whether I can store those values after the SQL table is wiped, and how I would associate those numbers with a date. If I have that, I know how to show the changes over time.

 

Thank you!

1 ACCEPTED SOLUTION
bcdobbs
Community Champion

Have a look at the two example files in this folder.

 

Example 1 is a very basic example of how your snapshotted SQL table could look with a custom date table related to it in order to enable future analysis.

 

Example 2 then uses Power Query to shape the data into a better star schema, which will keep the model faster as your data volume grows. This data transformation could also be done at source in SQL Server, using SSIS for example.
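As a rough sketch of what that at-source shaping could look like in T-SQL: a server dimension plus a slim fact table keyed by date and server. The table and column names here are assumptions based on the sample rows in the thread, not the names used in the example files.

```sql
-- Hypothetical star schema for the daily file snapshot.
-- Server attributes live once in a dimension table...
CREATE TABLE dbo.DimServer (
    ServerKey  int IDENTITY(1, 1) PRIMARY KEY,
    ServerName nvarchar(50) NOT NULL,
    Cluster    nvarchar(50) NOT NULL,
    Pool       nvarchar(50) NOT NULL,
    IP         varchar(15)  NOT NULL
);

-- ...and the fact table holds one row per file per snapshot date,
-- carrying only the dimension key instead of repeating the attributes.
CREATE TABLE dbo.FactFile (
    SnapshotDate date NOT NULL,
    ServerKey    int  NOT NULL REFERENCES dbo.DimServer (ServerKey),
    FileName     nvarchar(260) NOT NULL
);
```

In Power BI the fact table would then relate to both the server dimension and a date table, and a simple row count per server per day becomes cheap even as history accumulates.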

 

Hope they show how it could look.



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!


5 REPLIES
bcdobbs
Community Champion

Are you able to create a new table in your SQL database? I'd schedule an insert statement to run each day to insert that day's rows and stamp them with that day's date.
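That scheduled insert could look roughly like this in T-SQL. The table and column names are assumptions (the source table hasn't been described at this point in the thread):

```sql
-- Append today's report to a persistent snapshot table,
-- stamping every row with the current date.
INSERT INTO dbo.FileSnapshot (SnapshotDate, ServerName, Cluster, Pool, IP, FileName)
SELECT CAST(GETDATE() AS date),  -- today's date, time portion stripped
       ServerName, Cluster, Pool, IP, FileName
FROM   dbo.DailyFileReport;      -- the table that gets wiped daily
```

Run via SQL Server Agent (or any scheduler) before the daily wipe, this preserves each day's state even after the source table is overwritten.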

 

Depending on the level of detail you need, you could use a GROUP BY to pre-aggregate your snapshot.
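A pre-aggregated variant might look like this: one row per server per day instead of one row per file (again, table and column names are assumptions):

```sql
-- Store only the daily file count per server, dropping file-level detail.
INSERT INTO dbo.FileCountSnapshot (SnapshotDate, ServerName, FileCount)
SELECT CAST(GETDATE() AS date),
       ServerName,
       COUNT(*)                  -- files per server in today's report
FROM   dbo.DailyFileReport
GROUP BY ServerName;
```

This keeps the history table small, at the cost of losing the individual file paths for past days.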

 

Then in Power BI, add a relationship to a date table and you should be good to go.

 

There are other solutions involving incremental refresh, but snapshotting in SQL is easier and more resilient in my opinion.



Ben Dobbs


If I do the insert, I would wind up with duplicate names in the rows.

 

I apologize, I should have provided a bit more information. Here is what the table that gets wiped every day looks like:

 

| Server Name | Cluster   | Pool   | IP           | File Name            |
|-------------|-----------|--------|--------------|----------------------|
| Server-001  | Cluster 1 | Pool 1 | 192.168.1.4  | C:\Program Files\... |
| Server-001  | Cluster 1 | Pool 1 | 192.168.1.4  | C:\Program Files\... |
| Server-002  | Cluster 2 | Pool 1 | 192.168.1.46 | C:\Program Files\... |
| Server-003  | Cluster 1 | Pool 2 | 192.168.1.14 | C:\Program Files\... |

 

I suppose I could make a table that summarizes this table every day, with a new column per date. But in terms of capturing the details of this table, I am uncertain how to structure the data model to save this information properly. Specifically, I want to be able to show that Server-001 had two files yesterday and one file today, while still maintaining the specific information about each file path.

 

Is this making any sense?

Yes totally making sense. 

You're worrying too much about duplicate names, though. You want the same table structure but with a date column that gets populated when you do the insert. You could then build a clustered key based on date, server, and file name.
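As a sketch, assuming the column names from the sample rows above (types are guesses too), the snapshot table and clustered key could look like:

```sql
-- Snapshot table mirroring the daily report, plus a date column.
-- The clustered key means "duplicate" server names are fine:
-- each row is unique per (date, server, file) combination.
CREATE TABLE dbo.FileSnapshot (
    SnapshotDate date          NOT NULL,
    ServerName   nvarchar(50)  NOT NULL,
    Cluster      nvarchar(50)  NOT NULL,
    Pool         nvarchar(50)  NOT NULL,
    IP           varchar(15)   NOT NULL,
    FileName     nvarchar(260) NOT NULL,
    CONSTRAINT PK_FileSnapshot
        PRIMARY KEY CLUSTERED (SnapshotDate, ServerName, FileName)
);
```

Note the key columns are kept reasonably narrow (e.g. `nvarchar(260)` for the path) so the clustered index stays within SQL Server's 900-byte key limit.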

 

When you load that into Power BI you should then be able to get the information you need. I'll try and mock something up later this evening.



Ben Dobbs


Thank you very much! That makes a lot of sense. Greatly appreciate your time with this!
