Hi all,
Sorry - I'm sure this is incredibly simple, but I'm struggling with the best way to set up my data model so that I can get proper insights. Hope the below makes sense; let me know if anything needs clarifying.
I have received a sample dataset from our Product team, attached (I've anonymised it as I can't share the original file). It is a time series showing the inventory of a number of household products held in each store. Each row is a single day's worth of data per store and product, and each row also carries the total inventory across the industry for that product on that day.
I've tried using something like the star schema approach for constructing the data model (separating the dimension data from the fact data), but the visuals actually seemed to work better when I just left all the data in a single table and built charts and graphs over that.
I guess I just wanted to see how others (far more experienced than me!) would structure this in Power Query and use it in Power BI to create better visuals, especially given the time series nature of the data. Sorry if this is a bit vague; I'm still very new to Power BI!
Thanks
DC
Hi @DC2024 ,
You can create separate dimension tables for store, product, and date.
For example:
Fact table: the daily inventory rows (date, product, store, and the inventory figures).
Product dimension table: one row per product (Product ID, Product Name).
Store dimension table: one row per store (Store Inventory Id, Store Location).
Date dimension table, created in DAX:
DateTable =
ADDCOLUMNS (
    CALENDAR ( MIN ( 'Fact Table'[Date] ), MAX ( 'Fact Table'[Date] ) ),
    "Year", YEAR ( [Date] ),
    "Quarter", ROUNDUP ( MONTH ( [Date] ) / 3, 0 ),
    "Month", MONTH ( [Date] ),
    "Day", DAY ( [Date] )
)
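If it helps, the product and store dimension tables can also be prototyped as DAX calculated tables from the flat source. This is only a sketch based on the column names in your sample data ('Fact Table', Product ID, Product Name, Store Inventory Id, Store Location), so rename to match your model:

```dax
-- Product dimension: one distinct row per product, taken from the flat fact table
Product =
DISTINCT (
    SELECTCOLUMNS (
        'Fact Table',
        "Product ID", 'Fact Table'[Product ID],
        "Product Name", 'Fact Table'[Product Name]
    )
)

-- Store dimension: one distinct row per store
Store =
DISTINCT (
    SELECTCOLUMNS (
        'Fact Table',
        "Store Inventory Id", 'Fact Table'[Store Inventory Id],
        "Store Location", 'Fact Table'[Store Location]
    )
)
```

In an import model you may prefer to do this split in Power Query instead (Remove Other Columns, then Remove Duplicates), so the duplicated text columns never land in the fact table in the first place.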
Create relationships from the fact table to each dimension table (on Product ID, Store Inventory Id, and Date). This is key to a functional star schema.
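Once the relationships are in place, measures can slice cleanly by any dimension. As a rough sketch with my own measure names (and assuming, as in your sample, that the industry figure repeats on every store row for a given product and day):

```dax
Store Inventory = SUM ( 'Fact Table'[Inventory (Store)] )

-- The industry total is duplicated on every store row, so de-duplicate it
-- per product before summing; assumes one industry figure per product per day.
Industry Inventory =
SUMX (
    VALUES ( 'Fact Table'[Product ID] ),
    CALCULATE ( MAX ( 'Fact Table'[Total Product Inventory (Industry)] ) )
)

Store vs Industry % = DIVIDE ( [Store Inventory], [Industry Inventory] )
```

If you also mark DateTable as a date table, time-intelligence functions such as DATESYTD and DATEADD will work against these measures.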
The pbix file is attached.
If you have any further questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helps, then please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
That is fantastic, thank you very much Yang!
Sorry - it doesn't seem to have let me attach the file! Here is a sample of the data:
Date | Product ID | Product Name | Total Product Inventory (Industry) | Store Location | Inventory (Store) | % Inventory Store vs. Industry | Store Inventory Id |
---|---|---|---|---|---|---|---|
13/02/2024 | 7019466 | Bin | 5106000 | Leeds | 1153 | 0.022581 | 6045167 |
14/02/2024 | 7019466 | Bin | 5106000 | Leeds | 1153 | 0.022581 | 6045167 |
15/02/2024 | 7019466 | Bin | 5106000 | Leeds | 1153 | 0.022581 | 6045167 |
16/02/2024 | 7019466 | Bin | 5106000 | Leeds | 1153 | 0.022581 | 6045167 |
13/02/2024 | 7000211 | Hoover | 78560000 | London | 18000000 | 22.912423 | 6001065 |
14/02/2024 | 7000211 | Hoover | 78560000 | London | 18000000 | 22.912423 | 6001065 |
15/02/2024 | 7000211 | Hoover | 78560000 | London | 18000000 | 22.912423 | 6001065 |
16/02/2024 | 7000211 | Hoover | 78560000 | London | 18000000 | 22.912423 | 6001065 |
16/02/2024 | 7000486 | Duster | 2989454000 | Manchester | 1625186 | 0.054363 | 6001910 |
16/02/2024 | 7000486 | Duster | 2989454000 | Southampton | 2036603 | 0.068126 | 6001911 |
15/02/2024 | 7000486 | Duster | 2989454000 | Manchester | 1625186 | 0.054363 | 6001910 |
15/02/2024 | 7000486 | Duster | 2989454000 | Southampton | 2036603 | 0.068126 | 6001911 |
14/02/2024 | 7000486 | Duster | 2989454000 | Manchester | 1625186 | 0.054363 | 6001910 |
14/02/2024 | 7000486 | Duster | 2989454000 | Southampton | 2036603 | 0.068126 | 6001911 |
13/02/2024 | 7000486 | Duster | 2989454000 | Manchester | 1625186 | 0.054363 | 6001910 |
13/02/2024 | 7000486 | Duster | 2989454000 | Southampton | 2036603 | 0.068126 | 6001911 |
13/02/2024 | 7011835 | Sponge | 89483000 | Portsmouth | 117989 | 0.131856 | 6412733 |
13/02/2024 | 7011835 | Sponge | 89483000 | Brighton | 1000791 | 1.118414 | 6412734 |
13/02/2024 | 7011835 | Sponge | 89483000 | Newcastle | 98249 | 0.109796 | 6412735 |
13/02/2024 | 7011835 | Sponge | 89483000 | London | 1312844 | 1.467143 | 6412736 |