Hi all,
I'm new to Power Query and looking for a solution to solve a small issue I have with my report.
I'm looking into sales data. My goal is to pull data from two different sources (Selection of weekly products & Daily Sales), which I do by automatically importing files from given folders, and build a dynamic dashboard in Excel with Power Pivot.
My two sources are structured like this:
Selection of weekly products
WEEK | Upload date | Product No. | Brand | Category |
32 | 09.08.2020 | P001 | Gucci | shoes |
32 | 09.08.2020 | P002 | Saint Laurent | bags |
32 | 09.08.2020 | P003 | Gucci | clothing |
WEEK | Upload date | Product No. | Brand | Category |
33 | 16.08.2020 | P009 | Gucci | shoes |
33 | 16.08.2020 | P010 | Valentino | bags |
33 | 16.08.2020 | P011 | Gucci | clothing |
Daily Sales
Sale Date | No. | QTY | Revenue |
10.08.2020 | P001 | 1 | 100 |
10.08.2020 | P003 | 4 | 250 |
Sale Date | No. | QTY | Revenue |
16.08.2020 | P001 | 2 | 200 |
16.08.2020 | P010 | 3 | 600 |
What I want to achieve in my Excel pivot, by creating connections, is that when I filter by e.g. WEEK 32 I get sales numbers only from that week and not from the following weeks (the products will still sell after those days, but I'm only interested in the first week's sales). The problem arises mainly because I'm not able to create bridge tables, since one source gives me the week number and the other gives me the sale date.
Please let me know if something is unclear. I thank you in advance for your help!
Best,
Lorenzo
Let me know if what follows is close to your goal:
query Product:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjZQ0lEKMDAwBFKJiUAiGcQyMtM3MNc3MjAyUIrVASoyhCgyAlJJSSBFIJaBkb6BBaYiYxSTcCgyAckn41JkBFFkCrcOZKaBJVZFZnDr0BXFAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Week = _t, Product = _t, Brand = _t, Category = _t, #"Upload date" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Week", Int64.Type}, {"Product", type text}, {"Brand", type text}, {"Category", type text}, {"Upload date", type date}},"it-IT"),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Qty", each List.Sum(Table.SelectRows(sales, (r)=> (r[wk]=[Week] and r[Product]=[Product] ))[QTY])),
#"Added Custom 1" = Table.AddColumn(#"Added Custom", "rev", each List.Sum(Table.SelectRows(sales, (r)=> (r[wk]=[Week] and r[Product]=[Product] ))[Revenue]))
in #"Added Custom 1"
query sales:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCjAwMFTSUQJjAwMgaWSmb2Cub2RgZKAUqwOXNwJhiLw5NnkTEAbLGxjrG1igyBtD9RuD5Q0N0OXNoPabokvHAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Product = _t, QTY = _t, Revenue = _t, #"Sale Date" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{"Sale Date", type date}),
#"Changed Type2" = Table.TransformColumnTypes(#"Changed Type",{{"QTY", Int64.Type}, {"Revenue", Int64.Type}}),
#"Added Custom" = Table.AddColumn(#"Changed Type2", "wk", each Date.WeekOfYear([Sale Date],2)),
#"Changed Type1" = Table.TransformColumnTypes(#"Added Custom",{{"wk", Int64.Type}})
in
#"Changed Type1"
with the added auxiliary column wk.
The wk column was "forcibly" derived from the Sale Date column so that 7/26 and 7/27 get the same week number; under other conventions, those could be two days of different weeks.
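For illustration, the same "week 1 contains 1 January, with a configurable first day of week" idea can be sketched in Python. This is a hypothetical helper, not a re-implementation of Date.WeekOfYear; assuming the M Day enum maps Day.Sunday = 0, the second argument 2 in the query above would correspond to Day.Tuesday:

```python
from datetime import date, timedelta

def week_of_year(d: date, first_day: int) -> int:
    # Week 1 is the week containing 1 January; first_day uses Python's
    # weekday() numbering (0 = Monday ... 6 = Sunday).
    start_of_week = lambda x: x - timedelta(days=(x.weekday() - first_day) % 7)
    return (start_of_week(d) - start_of_week(date(d.year, 1, 1))).days // 7 + 1

# With a Tuesday-start week (first_day=1), Sunday 2020-07-26 and
# Monday 2020-07-27 get the same week number, as described above.
print(week_of_year(date(2020, 7, 26), 1), week_of_year(date(2020, 7, 27), 1))
```

With this convention, sales on 10.08.2020 land in week 32 and sales on 16.08.2020 in week 33, matching the upload weeks in the sample data.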
And this is the result:
Hi @Anonymous
Thanks for your support and sorry for my late reply.
I think this may actually work, but do you have any idea why it's extremely slow to load and refresh? Even though I don't have a huge amount of data to work with, it's still taking hours to refresh every time.
Thanks a lot!
Lorenzo
<<... I think this may actually work, but do you have any idea why it's extremely slow to load and refresh? Even though I don't have a huge amount of data to work with, it's still taking hours to refresh every time. ...>>
#"Added Custom" = Table.AggregateTableColumn(Table.AddColumn(#"Changed Type", "QR", each Table.SelectRows(sales, (r)=> (r[wk]=[Week] and r[Product]=[Product]))[[QTY],[Revenue]]), "QR", {{"QTY", List.Sum, "sumQ"}, {"Revenue", List.Sum, "sumR"}})
Hi @Anonymous
Thanks for the support!
Unfortunately, none of the above suggestions drastically speeds up performance.
As I said above, I have one Product List file added every week (only 50 items/rows a week max) and one Sales file added every single day, 5 days a week (around 300 rows each day). This is the size of my data.
If, for example, I consider one month, I may have 200 rows for the Product Lists and around 7.000 for sales.
Is this clear now?
Thank you!
I'm sorry, but I don't remember the details of this topic anymore.
I don't have time right now to pick it up again and dig deeper.
I will do it as soon as I can; as I understand it, this is not urgent.
Recycling Lists: 50 rows a week
let
Source = Folder.Files("\\mytheresa.local\Files\Public\MERCHANDISING RETAIL\RECYCLING\KW\Recycling Lists"),
#"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
#"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from Recycling Lists", each #"Transform File from Recycling Lists"([Content])),
#"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
#"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from Recycling Lists"}),
#"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Recycling Lists", Table.ColumnNames(#"Transform File from Recycling Lists"(#"Sample File"))),
#"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Week", each Date.WeekOfYear([Latest Recycling Date])),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Source.Name"}),
#"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"No.", "No"}}),
#"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"Core Item", type text}, {"Brand", type text}, {"No", type text}, {"Main Waregroup", type text}, {"Item Category", type text}, {"Latest Recycling Date", type date}, {"Week", Int64.Type}}),
#"Added Custom1" = Table.AggregateTableColumn(Table.AddColumn(#"Changed Type", "QR", each Table.SelectRows(DOR, (r)=> (r[Date]=[Latest Recycling Date] and r[No]=[No]))[[QTY],[Revenue Net]]), "QR", {{"QTY", List.Sum, "sumQ"}, {"Revenue Net", List.Sum, "sumR"}})
in
#"Added Custom1"
DOR (Daily Order Report): 300 rows a day
let
Source = Folder.Files("\\mytheresa.local\Files\Public\MERCHANDISING RETAIL\RECYCLING\KW\DOR"),
#"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
#"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from DOR", each #"Transform File from DOR"([Content])),
#"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
#"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from DOR"}),
#"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from DOR", Table.ColumnNames(#"Transform File from DOR"(#"Sample File (2)"))),
#"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Date", each Text.Range([Source.Name],0,8)),
#"Changed Type" = Table.TransformColumnTypes(#"Added Custom",{{"Date", type date}}),
#"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Source.Name", "Style", "Item Brand", "Main Waregroup", "Department Group", "Onlineset", "For Sale Date", "Season", "Is Core Item", "Is Exclusive", "Colour", "sale price", "Qty Discount", "Revenue Discount"})
in
#"Removed Columns"
A different way: grouping the sales table first.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCjAwMFTSUQJjAwMgaWSmb2Cub2RgZKAUqwOXNwJhiLw5NnkTEAbLGxjrG1igyBtD9RuD5Q0N0OXNoPabokvHAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Product = _t, QTY = _t, Revenue = _t, #"Sale Date" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Sale Date", type date},{"QTY", Int64.Type}, {"Revenue", Int64.Type}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "wk", each Date.WeekOfYear([Sale Date],2), type number),
#"Grouped Rows" = Table.Group(#"Added Custom", {"Product","wk"}, {{"all", each _, type table [Product=text, QTY=nullable number, Revenue=nullable number, Sale Date=nullable date]}})
in
#"Grouped Rows"
Then looking up the subgroups should be faster:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("dc1BDoAgDATAv/RMwlIQ9RfeCQfg/3+wDUqU6KXbpJNtSuRBhg7ASZQio+nG0WK1DAZlI8h1xBK1KtINbLE9EHfkX01uQldT0Hu70U/TMt5pJ/ZPFMe7GeUT", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Week = _t, Product = _t, Brand = _t, Category = _t, #"Upload date" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Week", Int64.Type}, {"Product", type text}, {"Brand", type text}, {"Category", type text}, {"Upload date", type date}},"it-IT"),
ac1= Table.ExpandListColumn(Table.AddColumn( #"Changed Type", "QR", each List.Select(Table.SelectRows(salesGrp, (r)=> (r[Product]=[Product]))[all], (l)=>l[wk]{0}=[Week] )),"QR"),
tac= Table.AggregateTableColumn(ac1, "QR",{{"QTY",List.Sum,"sumQ"},{"Revenue",List.Sum,"sumR"}})
in
tac
But to test and possibly improve the execution speed, we need the details (only the structure and sizes, of course) of the real dataset you are working on.
I'm sorry, but I no longer remember the details of this problem.
Since there seems to be a performance problem, to find a useful solution you should also provide the data volumes, or better, the tables themselves (even with fake data, but with the same structure as your real data), so there is something to reason about when trying to make the algorithm more efficient.
Perhaps a merge could be more performant:
let
Source = Table.NestedJoin(productm, {"Product", "Week"}, sales, {"Product", "wk"}, "product.1", JoinKind.FullOuter),
#"Aggregated product.1" = Table.AggregateTableColumn(Source, "product.1", {{"QTY", List.Sum, "Sum of product.1.QTY"}, {"Revenue", List.Sum, "Sum of product.1.Revenue"}})
in
#"Aggregated product.1"
This uses the same sales table as above.
Hi @Anonymous ,
if my understanding is correct, you can significantly speed up your first query using this method: https://www.thebiccountant.com/2017/05/29/performance-tip-partition-tables-crossjoins-possible-powerquery-powerbi/
With regards to your second suggestion, I'd expect it to be very slow as well. Here I've described how to speed it up: https://www.thebiccountant.com/2019/10/28/performance-tip-for-aggregations-after-joins-in-power-query-and-power-bi/
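The core idea of that second tip, aggregating once up front instead of doing a row-by-row scan of the sales table per product, can be sketched in plain Python with hypothetical data:

```python
from collections import defaultdict

# Hypothetical sales rows, mirroring the thread's sales table.
sales = [
    {"Product": "P001", "wk": 33, "QTY": 2, "Revenue": 200},
    {"Product": "P010", "wk": 34, "QTY": 3, "Revenue": 600},
    {"Product": "P001", "wk": 33, "QTY": 1, "Revenue": 100},
]

# Aggregate once per (Product, wk) key ...
totals = defaultdict(lambda: [0, 0])
for row in sales:
    t = totals[(row["Product"], row["wk"])]
    t[0] += row["QTY"]
    t[1] += row["Revenue"]

# ... then each product row is a constant-time lookup, instead of
# re-scanning the whole sales table per row (the slow pattern above).
print(totals[("P001", 33)])  # → [3, 300]
```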
Imke Feldmann (The BIccountant)
If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!
How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries
Hi @Anonymous
As explained above, here the two sources I have and the relative queries
Products List (1x week)
No. | Brand Description | Description | Main Waregroup Description | Pre Shopping Date |
P00472852 | Paco Rabanne | shoulder bags | bags | 05.07.2020 |
P00478613 | Zimmermann | jumpsuits | clothing | 05.07.2020 |
P00478618 | Zimmermann | dresses | clothing | 05.07.2020 |
P00481172 | Bottega Veneta | pumps | shoes | 05.07.2020 |
P00481332 | Bottega Veneta | high sandals | shoes | 05.07.2020 |
P00482376 | Dries van Noten | high sandals | shoes | 05.07.2020 |
P00482532 | Bottega Veneta | sunglasses | accessoires | 05.07.2020 |
P00483132 | Bottega Veneta | clutches | bags | 05.07.2020 |
P00483185 | Bottega Veneta | shoulder bags | bags | 05.07.2020 |
P00483230 | Off-White | totes | bags | 05.07.2020 |
P00483529 | Saint Laurent | belts | accessoires | 05.07.2020 |
P00483897 | Off-White | skirts | clothing | 05.07.2020 |
P00483908 | Off-White | jackets | clothing | 05.07.2020 |
P00484251 | Balmain | tops | clothing | 05.07.2020 |
P00484294 | Balmain | skirts | clothing | 05.07.2020 |
P00487097 | Alexandre Vauthier | skirts | clothing | 05.07.2020 |
P00487691 | Loewe | pumps | shoes | 05.07.2020 |
P00487799 | Bottega Veneta | pants | clothing | 05.07.2020 |
P00487823 | Bottega Veneta | dresses | clothing | 05.07.2020 |
P00487955 | Gucci | hats/hair accessories | accessoires | 05.07.2020 |
P00487956 | Gucci | hats/hair accessories | accessoires | 05.07.2020 |
P00487957 | Gucci | hats/hair accessories | accessoires | 05.07.2020 |
P00488001 | Gucci | shoulder bags | bags | 05.07.2020 |
P00488352 | Valentino | tops | clothing | 05.07.2020 |
P00488545 | Loewe | jackets | clothing | 05.07.2020 |
P00488559 | Tom Ford | pumps | shoes | 05.07.2020 |
P00488571 | Tom Ford | high sandals | shoes | 05.07.2020 |
P00488591 | Loewe | jackets | clothing | 05.07.2020 |
P00488670 | Loewe | tops | clothing | 05.07.2020 |
P00488753 | Valentino | pants | clothing | 05.07.2020 |
P00489313 | Givenchy | pumps | shoes | 05.07.2020 |
P00489827 | Saint Laurent | shorts | clothing | 05.07.2020 |
P00489853 | Saint Laurent | capes/ponchos | clothing | 05.07.2020 |
P00489868 | Saint Laurent | dresses | clothing | 05.07.2020 |
P00490811 | Stella McCartney | ankle boots | shoes | 05.07.2020 |
P00491849 | Fendi | dresses | clothing | 05.07.2020 |
P00496045 | Givenchy | dresses | clothing | 05.07.2020 |
P00496717 | Gucci | tops | clothing | 05.07.2020 |
P00496786 | Dolce & Gabbana | dresses | clothing | 05.07.2020 |
P00496787 | Dolce & Gabbana | dresses | clothing | 05.07.2020 |
P00497283 | Rick Owens | dresses | clothing | 05.07.2020 |
P00497853 | Balmain | swimwear | clothing | 05.07.2020 |
P00498363 | Gucci | sweatshirts | clothing | 05.07.2020 |
P00499108 | Costarellos | cocktail/gowns | clothing | 05.07.2020 |
P00499296 | Chloé | dresses | clothing | 05.07.2020 |
P00499632 | The Attico | dresses | clothing | 05.07.2020 |
P00499638 | The Attico | pants | clothing | 05.07.2020 |
P00499715 | Paco Rabanne | dresses | clothing | 05.07.2020 |
P00491734 | Fendi | jackets | clothing | 05.07.2020 |
P00499746 | Paco Rabanne | casual tops | clothing | 05.07.2020 |
let
Source = Folder.Files("T:\Public\MERCHANDISING RETAIL\Pre-Shopping\Pre-Shopping Lists"),
#"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
#"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from Pre-Shopping Lists", each #"Transform File from Pre-Shopping Lists"([Content])),
#"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
#"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from Pre-Shopping Lists"}),
#"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Pre-Shopping Lists", Table.ColumnNames(#"Transform File from Pre-Shopping Lists"(#"Sample File"))),
#"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Week", each Date.WeekOfYear([Pre Shopping Date])),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Source.Name"}),
#"Reordered Columns" = Table.ReorderColumns(#"Removed Columns",{"Week", "No.", "Brand Description", "Description", "Main Waregroup Description"}),
#"Renamed Columns" = Table.RenameColumns(#"Reordered Columns",{{"No.", "No"}}),
#"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"Pre Shopping Date", type date}, {"Week", Int64.Type}, {"No", type text}, {"Brand Description", type text}, {"Description", type text}, {"Main Waregroup Description", type text}}),
#"Added Custom1" = Table.AddColumn(#"Changed Type", "Qty", each List.Sum(Table.SelectRows(DOR, (r)=> (r[Week]=[Week] and r[No]=[No] ))[QTY])),
#"Added Custom2" = Table.AddColumn(#"Added Custom1", "Revenue", each List.Sum(Table.SelectRows(DOR, (r)=> (r[Week]=[Week] and r[No]=[No] ))[Revenue Net]))
in
#"Added Custom2"
Daily Order Report (1x Day) avg. 5.000 rows
Description | Style | Item Brand | Main Waregroup | Department Group | Onlineset | For Sale Date | Season | Is Core Item | Is Exclusive | Colour | QTY | sale price | Revenue Net | Qty Discount | Revenue Discount |
P00394784 belts | Unassigned | Off-White | accessories | WOMENSWEAR | 2000_03_10 | 8/8/2019 | 20/1 | No | No | 6010 - YELLOW BLACK | 30 | 86 | 2.340 € | 30 | 2.340 € |
P00435551 sneakers | Unassigned | Stella McCartney | shoes | WOMENSWEAR | 08 | 11/29/2019 | 20/1 | No | No | 9099 - WH/MULTI | 18 | 175 | 2.704 € | 18 | 2.704 € |
P00440255 hats/hair accessories | Unassigned | Jacquemus | accessories | WOMENSWEAR | 2020_03_16 | 3/13/2020 | 20/1 | No | No | NATURAL - NATURAL | 13 | 227 | 2.019 € | 13 | 2.019 € |
P00445233 shoulder bags | Unassigned | Loewe | bags | MENSWEAR | 2020_06_01 | 1/15/2020 | 20/1 | No | No | 2530 - BROWN | 12 | 175 | 1.507 € | 12 | 1.507 € |
P00422869 belts | Unassigned | Off-White | accessories | WOMENSWEAR | 15 | 12/9/2019 | 20/1 | No | Yes | 0110 - WHITE BASE/BLACK WRITING | 11 | 107 | 792 € | 11 | 792 € |
P00494290 espadrilles | Unassigned | Castañer | shoes | WOMENSWEAR | 2020_06_09 | 6/8/2020 | 20/2 | No | No | 100 - NEGRO | 10 | 105 | 888 € | ||
P00394419 totes | Unassigned | Off-White | bags | WOMENSWEAR | 2000_03_10 | 5/29/2019 | 20/1 | No | No | 1001 - BLACK WHITE | 9 | 234 | 1.564 € | 9 | 1.564 € |
P00452928 sneakers | Unassigned | Acne Studios | shoes | MENSWEAR | 2020_06_10 | 3/17/2020 | 20/1 | No | No | IVORY WHIT - IVORY WHITE | 9 | 156 | 902 € | 9 | 902 € |
P00450812 shoulder bags | Unassigned | Jil Sander | bags | WOMENSWEAR | 2020_02_28 | 4/1/2020 | 20/1 | No | No | 680 - CANDY | 9 | 330 | 1.802 € | 9 | 1.802 € |
let
Source = Folder.Files("\\mytheresa.local\Files\Public\MERCHANDISING RETAIL\RECYCLING\KW\DOR"),
#"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
#"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from DOR", each #"Transform File from DOR"([Content])),
#"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
#"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from DOR"}),
#"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from DOR", Table.ColumnNames(#"Transform File from DOR"(#"Sample File (2)"))),
#"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Date", each Text.Range([Source.Name],0,8)),
#"Changed Type" = Table.TransformColumnTypes(#"Added Custom",{{"Date", type date}}),
#"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Source.Name", "Style", "Item Brand", "Main Waregroup", "Department Group", "Onlineset", "For Sale Date", "Season", "Is Core Item", "Is Exclusive", "Colour", "sale price", "Qty Discount", "Revenue Discount"})
in
#"Removed Columns"
Thank you!
Lorenzo
HI @LorenzoMolteni ,
Try creating a relationship between your 2 tables on Product No.
Thanks,
Pragati
Hi @Pragati11
That relationship is already created, but it doesn't help solve the above-mentioned issue with WeekNum and Date.
Best,
Lorenzo
I do not think you need a bridge table. You need a Date table with at least 2 fields. (It should have a few dozen, but these two are critical for you.) The first is the date itself; the second is a Year/Week key:
Date.Year([Date]) * 100 + Date.WeekOfYear([Date])
You then need to add a matching Year/Week column to the table that only has the week number. Ideally compute it from a date, or you can use this formula, though it is a bit of a hard-code:
2020 * 100 + Date.WeekOfYear([Date])
Then use the date or week or year/week field from your date table, and relate the Date[Date] field to your sales date, and the Date[Year Week] to your Year/Week num field in the other table.
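As a sanity check of that key, here is the same Year/Week calculation in Python; note that isocalendar() uses ISO-8601 weeks, which can differ from Date.WeekOfYear's default convention near year boundaries:

```python
from datetime import date

def year_week_key(d: date) -> int:
    # Year * 100 + week, mirroring Date.Year([Date]) * 100 + Date.WeekOfYear([Date]).
    iso_year, iso_week, _ = d.isocalendar()
    return iso_year * 100 + iso_week

print(year_week_key(date(2020, 8, 10)))  # → 202033 (a Monday in ISO week 33)
```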
Here is a fairly comprehensive date table with this info, and below a link to show you how to make it dynamic.
let
Source = {Number.From(#date(2020,1,1))..Number.From(#date(2020,12,31))},
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), {"Date"}, null, ExtraValues.Error),
#"Changed Type" = Table.TransformColumnTypes(#"Converted to Table",{{"Date", type date}}),
#"Inserted Year" = Table.AddColumn(#"Changed Type", "Year", each Date.Year([Date]), Int64.Type),
#"Inserted Month" = Table.AddColumn(#"Inserted Year", "Month", each Date.Month([Date]), Int64.Type),
#"Inserted Month Name" = Table.AddColumn(#"Inserted Month", "Month Name", each Date.MonthName([Date]), type text),
#"Added Short Month Name" = Table.AddColumn(#"Inserted Month Name", "Short Month Name", each Text.Start([Month Name],3), type text),
#"Added Month Year" = Table.AddColumn(#"Added Short Month Name", "Month Year", each [Short Month Name] & " " & Text.From([Year]), type text),
#"Added Month Year Sort" = Table.AddColumn(#"Added Month Year", "Month Year Sort", each [Year]*100 + [Month], Int64.Type),
#"Inserted Quarter Number" = Table.AddColumn(#"Added Month Year Sort", "Quarter Number", each Date.QuarterOfYear([Date]), Int64.Type),
#"Inserted Quarter" = Table.AddColumn( #"Inserted Quarter Number","Quarter", each "Qtr " & Text.From([Quarter Number]), type text),
#"Inserted Quarter Year Sort" = Table.AddColumn(#"Inserted Quarter", "Quarter Year Sort", each [Year] * 10 + [Quarter Number], Int64.Type),
#"Inserted Quarter Year" = Table.AddColumn(#"Inserted Quarter Year Sort", "Quarter Year", each "Q" & Text.End([Quarter],1) & " " & Text.From([Year]), type text),
#"Inserted Week of Year" = Table.AddColumn(#"Inserted Quarter Year", "Week of Year", each Date.WeekOfYear([Date]), Int64.Type),
#"Inserted Week of Month" = Table.AddColumn(#"Inserted Week of Year", "Week of Month", each Date.WeekOfMonth([Date]), Int64.Type),
#"Inserted Day" = Table.AddColumn(#"Inserted Week of Month", "Day", each Date.Day([Date]), Int64.Type),
#"Inserted Day of Week" = Table.AddColumn(#"Inserted Day", "Day of Week", each Date.DayOfWeek([Date]), Int64.Type),
#"Inserted Day of Year" = Table.AddColumn(#"Inserted Day of Week", "Day of Year", each Date.DayOfYear([Date]), Int64.Type),
#"Inserted Day Name" = Table.AddColumn(#"Inserted Day of Year", "Day Name", each Date.DayOfWeekName([Date]), type text),
#"Added IsFuture Boolean" = Table.AddColumn(#"Inserted Day Name", "IsFuture", each [Date] > DateTime.Date(DateTime.LocalNow()), type logical),
#"Added IsInCurrentWeek" = Table.AddColumn(#"Added IsFuture Boolean", "IsInCurrentWeek", each Date.IsInCurrentWeek([Date]), type logical),
#"Added IsInCurrentMonth" = Table.AddColumn(#"Added IsInCurrentWeek", "IsInCurrentMonth", each Date.IsInCurrentMonth([Date]), type logical),
#"Added IsInCurrentQuarter" = Table.AddColumn(#"Added IsInCurrentMonth", "IsInCurrentQuarter", each Date.IsInCurrentQuarter([Date]), type logical),
#"Added IsInCurrentYear" = Table.AddColumn(#"Added IsInCurrentQuarter","IsInCurrentYear", each Date.IsInCurrentYear([Date]), type logical),
#"Added Year Week" = Table.AddColumn(#"Added IsInCurrentYear", "Year Week", each Date.Year([Date]) * 100 + Date.WeekOfYear([Date]), Int64.Type)
in
#"Added Year Week"
How to use M code provided in a blank query:
1) In Power Query, select New Source, then Blank Query
2) On the Home ribbon, select "Advanced Editor" button
3) Remove everything you see, then paste the M code I've given you in that box.
4) Press Done
5) See this article if you need help using this M code in your model.
Creating a Dynamic Date Table in Power Query
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting

Hi @edhans
Thanks for your help with this. Really much appreciated.
What I still can't understand is how I can relate Date[Year Week] to the new Year/Week number in my source table, since each column contains duplicate values and Power Pivot doesn't allow me to do that.
How can I overcome this?
Thank you!
Lorenzo
ACK! You are correct. That is many to many. You'll need to convert that week number to a date. This will do that:
List.Min(
{
#date(2020+1,1,1),
Date.AddDays(
#date(2020,1,1),
-Date.DayOfWeek(
#date(2020,1,1),
Day.Monday
)+[Week of Year] * 7
)
}
)
You'll need to play with it and make sure it conforms to your week. My example starts on a Monday; you can change that to Day.Sunday, Day.Wednesday, or whatever.
Then connect the Date[Date] field (unique) to this new pseudo-week date field in your sales table.
And I've hardcoded the 2020. You should put in a valid year field so it doesn't stop working in 4 months.
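For reference, the M expression above translates to roughly this Python (a sketch: the year parameter replaces the hard-coded 2020, and min() mirrors the List.Min cap at 1 January of the next year; as noted, you may need to adjust it to your week convention):

```python
from datetime import date, timedelta

def pseudo_week_date(year: int, week_of_year: int) -> date:
    jan1 = date(year, 1, 1)
    # Monday of the week containing 1 January (weekday(): Monday = 0).
    monday = jan1 - timedelta(days=jan1.weekday())
    # Add week_of_year * 7 days, capped at 1 January of the next year.
    return min(date(year + 1, 1, 1), monday + timedelta(days=week_of_year * 7))

print(pseudo_week_date(2020, 1))  # → 2020-01-06
```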
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting

Hi @edhans
Thank you! Unfortunately, another problem has come up: I'm no longer able to build the relationship between my two sources (product lists and sales numbers). It remains inactive due to the other relationships between Sales and Date.
Please see the screenshot. I don't know how to solve this...
(Take everything you see here below as a try)
I'm not 100% clear on your tables. I know what is causing this inactive relationship though.
DIM tables are dimension tables and should describe your data. So a date table (describes dates), customer table (describes customers), product table (describes products.)
Your FACT tables are facts, or transactions, so sales, purchase, orders, etc.
FACT tables should not be related to each other, and DIM tables should not be related to each other.
You've related what looks to me like your DIM Product table to your DIM Date table. Why are you doing that? In other words, what date relationship do you care about between the dates in the date table and your product table?
I also don't know what your Item table and Product table are. Those sound like the same thing to me. Can they be merged into a single table? I know almost nothing about your data, but to me this should be 3 tables:
DIM Products
DIM Date
FACT Sales
The DIMs should be related 1 to many to the Sales table.
Post back with more info. You are getting deep into data modeling here, which is how you ultimately build a successful DAX model!
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting

Hi @edhans
Thanks for your reply!
First of all, there's no need for the Items table. That was just an attempt to solve the issue.
As you just said, there should be 3 tables in Power Pivot:
- Products list: This is a selection of products that gets uploaded and goes on sale. I have a new selection/list once a week (uploaded on Sunday). I get this from Power Query (import from folder). It contains the Product No., Brand, Pre-Shopping Date (date of upload), Week, and now the fake date you suggested creating. I want to study the sales of these products during their first week live only (i.e. if I filter for CW16, I only want to see the sales performance of the products I uploaded in CW16 during their first week, not taking into account sales data from CW17 onwards).
- Sales: This is basically the daily order report for all products we sell. I also get this from Power Query (import from folder). It holds various data, but the fields we definitely need are Product No., QTY, Revenue, and Date (sale date).
- Date: This is simply the date table you suggested creating, with Date and Year/Week.
Given the above, I assume I have to create a connection between:
- Product List & Sales on Product No.
- Date & Sales on Date
- Product List & Date on Fake Date / Date
I may have misunderstood your earlier advice, but I'm not sure. How can I overcome the issue?
Please find attached how the final report looks. The error occurs when filtering for multiple/different weeks.
Thanks so much!
Lorenzo
Final Report - error occurs when filtering by weeks
Why not create a true/false field in the product table that determines whether this is the first week? Date.IsInCurrentWeek() will do it based on today (the refresh date), but you can tweak that or calculate your own week number there. I don't think there is a need to relate that to the date table specifically. Then just add a filter for this field in your measures.
New Sales =
CALCULATE(
SUM(Sales),
FILTER(
Sales,
RELATED(Product[IsCurrentWeek]) = TRUE()
)
)
I typed that freehand and didn't test it at all, but you get the idea of what I'm considering here.
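The intent of that measure, sketched in plain Python with hypothetical data (a per-product first-week flag filters which sales rows get summed):

```python
# Hypothetical IsCurrentWeek flag per product, as in the suggested field.
is_current_week = {"P001": True, "P010": False}

sales = [
    {"No": "P001", "Revenue": 100},
    {"No": "P010", "Revenue": 600},
    {"No": "P001", "Revenue": 200},
]

# Sum revenue only for products flagged as in their first week,
# analogous to CALCULATE(SUM(...), FILTER(..., RELATED(...) = TRUE())).
new_sales = sum(r["Revenue"] for r in sales if is_current_week[r["No"]])
print(new_sales)  # → 300
```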
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting