
LorenzoMolteni
Frequent Visitor

Power Query - Bridge table - Weeknum, Date

Hi all,

 

I'm new to Power Query and looking for a solution to a small issue I have with my report.

 

I'm looking into sales data. My goal is to pull data from two different sources (Selection of weekly products and Daily Sales), which I import automatically from given folders, and build a moving dashboard in Excel with Power Pivot.

 

My two sources are built like this:

Selection of weekly products

WEEK | Upload date | Product No. | Brand | Category
32 | 09.08.2020 | P001 | Gucci | shoes
32 | 09.08.2020 | P002 | Saint Laurent | bags
32 | 09.08.2020 | P003 | Gucci | clothing

 

WEEK | Upload date | Product No. | Brand | Category
33 | 16.08.2020 | P009 | Gucci | shoes
33 | 16.08.2020 | P010 | Valentino | bags
33 | 16.08.2020 | P011 | Gucci | clothing

 

Daily Sales

Sale Date | No. | QTY | Revenue
10.08.2020 | P001 | 1 | 100
10.08.2020 | P003 | 4 | 250

 

Sale Date | No. | QTY | Revenue
16.08.2020 | P001 | 2 | 200
16.08.2020 | P010 | 3 | 600

 

What I want to achieve in my Excel pivot, by creating connections, is that when I filter by e.g. WEEK 32 I get sales numbers only from that week and not from the following weeks (the products keep selling after those days, but I'm only interested in the sales of the first week). The problem occurs mainly because I'm not able to create bridge tables: one source gives me the week number, while the other gives me the sale date.

 

Please let me know if something is unclear. I thank you in advance for your help!

 

Best,

 

Lorenzo

 

 

21 REPLIES
Anonymous
Not applicable

Let me know if what follows is close to your goal:

 

query Product:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjZQ0lEKMDAwBFKJiUAiGcQyMtM3MNc3MjAyUIrVASoyhCgyAlJJSSBFIJaBkb6BBaYiYxSTcCgyAckn41JkBFFkCrcOZKaBJVZFZnDr0BXFAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Week = _t, Product = _t, Brand = _t, Category = _t, #"Upload date" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Week", Int64.Type}, {"Product", type text}, {"Brand", type text}, {"Category", type text}, {"Upload date", type date}},"it-IT"),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Qty", each List.Sum(Table.SelectRows(sales, (r)=> r[wk]=[Week] and r[Product]=[Product])[QTY])),
    #"Added Custom 1" = Table.AddColumn(#"Added Custom", "rev", each List.Sum(Table.SelectRows(sales, (r)=> r[wk]=[Week] and r[Product]=[Product])[Revenue]))
in
    #"Added Custom 1"

 

 

query sales:

 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCjAwMFTSUQJjAwMgaWSmb2Cub2RgZKAUqwOXNwJhiLw5NnkTEAbLGxjrG1igyBtD9RuD5Q0N0OXNoPabokvHAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Product = _t, QTY = _t, Revenue = _t, #"Sale Date" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Sale Date", type date}},"it-IT"),
    #"Changed Type2" = Table.TransformColumnTypes(#"Changed Type",{{"QTY", Int64.Type}, {"Revenue", Int64.Type}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type2", "wk", each Date.WeekOfYear([Sale Date],2)),
    #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom",{{"wk", Int64.Type}})
in
    #"Changed Type1"

 

 

 with the added auxiliary column wk

The wk column was "forcibly" derived from the Sale Date column so that, for example, 7/26 and 7/27 end up with the same value, even though under other conventions they could fall in two different weeks.

 

 

And this is the result (screenshot attached):

 


 

 

Hi @Anonymous

 

Thanks for your support and sorry for my late reply.

I think this may actually work, but do you have any idea why it's extremely slow to load and refresh? Even though I don't have a huge amount of data to study, it still takes hours to refresh every time.

 

Thanks a lot!

 

Lorenzo

Anonymous
Not applicable

<<... think this may actually work, but do you have any idea why it's extremely slow to upload and refresh? Despite the fact I don't have a huge amount data to study, it's still taking hours to refresh every time. ..>>

 

 
I tried with the data you provided, making it "proliferate" the way you do with lactic ferments to make yogurt, and obtained a sales table of more than 5000 rows and a products table of 50 rows.
Using the old algorithm it took a few minutes (perhaps less than a minute, timed by hand).
So I have no idea whether your data size is why it takes so long on your PC.
In fact the code is not meant to be efficient; I just tried to be clear.
For example, it repeats the same cross-comparison between the two tables (sales and products) twice: once for the quantities and a second time for the revenue.
That means 5000 × 50 = 250,000 theoretical comparisons repeated twice, so 500,000 comparisons, which in fact are not many.
 
A first cut in time could be obtained by combining these two operations into one (therefore, theoretically, halving the comparisons from 500K to 250K)

with this ugly expression:

 

#"Added Custom" = Table.AggregateTableColumn(Table.AddColumn(#"Changed Type", "QR", each Table.SelectRows(sales, (r) => r[wk] = [Week] and r[Product] = [Product])[[QTY], [Revenue]]), "QR", {{"QTY", List.Sum, "sumQ"}, {"Revenue", List.Sum, "sumR"}})

 

 

Another way, which I have used on other occasions, is to group both tables by product and week and then limit the searches to the matching subgroups.
To give an example: if I have to search by crossing two tables of 1000 rows each, I need 1M comparisons.

If I can divide the two tables into 10 corresponding pieces of 100 rows each, I only have to make 10 × 100 × 100 = 100K comparisons, a tenth of before.
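The arithmetic of that partitioning trick can be sketched by counting comparisons. This is an illustration only, written in Python rather than M just to make the counts executable; the table sizes and keys are invented for the example:

```python
# Sketch: why bucketing two tables by a shared key cuts the number of
# row comparisons. Sizes and keys are invented for the illustration.
from collections import defaultdict

products = [{"key": k % 10} for k in range(1000)]  # 1000 product rows
sales = [{"key": k % 10} for k in range(1000)]     # 1000 sales rows

# Naive nested-loop lookup: every product row scans every sales row.
naive_comparisons = len(products) * len(sales)

# Grouped lookup: bucket the sales rows by key once, then each product
# row only scans the one bucket that matches its own key.
buckets = defaultdict(list)
for s in sales:
    buckets[s["key"]].append(s)

grouped_comparisons = sum(len(buckets[p["key"]]) for p in products)

print(naive_comparisons)    # 1000000
print(grouped_comparisons)  # 100000 -- a tenth of the work
```

With 10 buckets of 100 corresponding rows on each side, the grouped version does exactly the 10 × 100 × 100 comparisons described above.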
 
All that said, could you give some more details on the size of your tables?

It is not clear to me how many weeks, and therefore how many rows, the products table has.
 
 
 
 

Hi @Anonymous 

 

Thanks for the support!
Unfortunately none of the above suggestions is drastically speeding up my performance.

As I said above, I get one new Product List file every week (only 50 items/rows a week max) and one new Sales file every single day, 5 days a week (around 300 rows each day). This is the size of my data.

If, for example, I take one month into consideration, I may have 200 rows in the Product Lists and around 7,000 in sales.

Is this kind of clear for you now?

 

Thank you!

Anonymous
Not applicable

I'm sorry, but I don't remember the details of the subject anymore.
Right now I don't have time to pick it up again and dig deeper.
I will do it as soon as I can; as I understand it, this is not an urgent matter.

Recycling Lists: 50 rows a week


 

 

let
    Source = Folder.Files("\\mytheresa.local\Files\Public\MERCHANDISING RETAIL\RECYCLING\KW\Recycling Lists"),
    #"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from Recycling Lists", each #"Transform File from Recycling Lists"([Content])),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from Recycling Lists"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Recycling Lists", Table.ColumnNames(#"Transform File from Recycling Lists"(#"Sample File"))),
    #"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Week", each Date.WeekOfYear([Latest Recycling Date])),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Source.Name"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"No.", "No"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"Core Item", type text}, {"Brand", type text}, {"No", type text}, {"Main Waregroup", type text}, {"Item Category", type text}, {"Latest Recycling Date", type date}, {"Week", Int64.Type}}),
    #"Added Custom1" = Table.AggregateTableColumn(Table.AddColumn(#"Changed Type", "QR", each Table.SelectRows(DOR, (r) => r[Date] = [Latest Recycling Date] and r[No] = [No])[[QTY], [Revenue Net]]), "QR", {{"QTY", List.Sum, "sumQ"}, {"Revenue Net", List.Sum, "sumR"}})
in
    #"Added Custom1"

 

 

DOR (Daily Order Report): 300 rows a day

 


 

 

 

let
    Source = Folder.Files("\\mytheresa.local\Files\Public\MERCHANDISING RETAIL\RECYCLING\KW\DOR"),
    #"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from DOR", each #"Transform File from DOR"([Content])),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from DOR"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from DOR", Table.ColumnNames(#"Transform File from DOR"(#"Sample File (2)"))),
    #"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Date", each Text.Range([Source.Name],0,8)),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom",{{"Date", type date}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Source.Name", "Style", "Item Brand", "Main Waregroup", "Department Group", "Onlineset", "For Sale Date", "Season", "Is Core Item", "Is Exclusive", "Colour", "sale price", "Qty Discount", "Revenue Discount"})
in
    #"Removed Columns"

 

 

 

 

Anonymous
Not applicable

a different way, grouping the sales table

 


 

 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCjAwMFTSUQJjAwMgaWSmb2Cub2RgZKAUqwOXNwJhiLw5NnkTEAbLGxjrG1igyBtD9RuD5Q0N0OXNoPabokvHAgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Product = _t, QTY = _t, Revenue = _t, #"Sale Date" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Sale Date", type date},{"QTY", Int64.Type}, {"Revenue", Int64.Type}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "wk", each Date.WeekOfYear([Sale Date],2), type number),
    #"Grouped Rows" = Table.Group(#"Added Custom", {"Product","wk"}, {{"all", each _, type table [Product=text, QTY=nullable number, Revenue=nullable number, #"Sale Date"=nullable date, wk=number]}})
in
    #"Grouped Rows"

 

 

 

Searching within the subgroups should then be faster:

 


 

 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("dc1BDoAgDATAv/RMwlIQ9RfeCQfg/3+wDUqU6KXbpJNtSuRBhg7ASZQio+nG0WK1DAZlI8h1xBK1KtINbLE9EHfkX01uQldT0Hu70U/TMt5pJ/ZPFMe7GeUT", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Week = _t, Product = _t, Brand = _t, Category = _t, #"Upload date" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Week", Int64.Type}, {"Product", type text}, {"Brand", type text}, {"Category", type text}, {"Upload date", type date}},"it-IT"),
    ac1= Table.ExpandListColumn(Table.AddColumn( #"Changed Type", "QR", each List.Select(Table.SelectRows(salesGrp, (r)=> (r[Product]=[Product]))[all], (l)=>l[wk]{0}=[Week] )),"QR"),
    tac= Table.AggregateTableColumn(ac1, "QR",{{"QTY",List.Sum,"sumQ"},{"Revenue",List.Sum,"sumR"}})

in
    tac

 

 

 

But to test, and possibly improve, the speed of execution we need the details (only the structure and sizes, of course) of the real dataset you are working on.

Anonymous
Not applicable

I'm sorry, but I no longer remember the details of this problem.
Since there seems to be a performance problem, to find a useful solution you should also provide the dimensional data, or better the tables themselves (even with fake data, as long as they have the same structure as your real data), on which to reason and try out changes to make the algorithm more efficient.

Anonymous
Not applicable

Perhaps a merge could be more performant:

 

 

let
    Source = Table.NestedJoin(productm, {"Product", "Week"}, sales, {"Product", "wk"}, "product.1", JoinKind.FullOuter),
    #"Aggregated product.1" = Table.AggregateTableColumn(Source, "product.1", {{"QTY", List.Sum, "Sum of product.1.QTY"}, {"Revenue", List.Sum, "Sum of product.1.Revenue"}})
in
    #"Aggregated product.1"

 

 

this is the same sales table

 


 

 

ImkeF
Community Champion

Hi @Anonymous ,

if my understanding is correct, you can significantly speed up your first query using this method: https://www.thebiccountant.com/2017/05/29/performance-tip-partition-tables-crossjoins-possible-powerquery-powerbi/

 

With regards to your second suggestion, I'd expect it to be very slow as well. Here I've described how to speed it up: https://www.thebiccountant.com/2019/10/28/performance-tip-for-aggregations-after-joins-in-power-query-and-power-bi/

 

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

Hi @Anonymous 

 

As explained above, here are the two sources I have and their respective queries.

 

Products List (1x per week)

No. | Brand Description | Description | Main Waregroup Description | Pre Shopping Date
P00472852 | Paco Rabanne | shoulder bags | bags | 05.07.2020
P00478613 | Zimmermann | jumpsuits | clothing | 05.07.2020
P00478618 | Zimmermann | dresses | clothing | 05.07.2020
P00481172 | Bottega Veneta | pumps | shoes | 05.07.2020
P00481332 | Bottega Veneta | high sandals | shoes | 05.07.2020
P00482376 | Dries van Noten | high sandals | shoes | 05.07.2020
P00482532 | Bottega Veneta | sunglasses | accessoires | 05.07.2020
P00483132 | Bottega Veneta | clutches | bags | 05.07.2020
P00483185 | Bottega Veneta | shoulder bags | bags | 05.07.2020
P00483230 | Off-White | totes | bags | 05.07.2020
P00483529 | Saint Laurent | belts | accessoires | 05.07.2020
P00483897 | Off-White | skirts | clothing | 05.07.2020
P00483908 | Off-White | jackets | clothing | 05.07.2020
P00484251 | Balmain | tops | clothing | 05.07.2020
P00484294 | Balmain | skirts | clothing | 05.07.2020
P00487097 | Alexandre Vauthier | skirts | clothing | 05.07.2020
P00487691 | Loewe | pumps | shoes | 05.07.2020
P00487799 | Bottega Veneta | pants | clothing | 05.07.2020
P00487823 | Bottega Veneta | dresses | clothing | 05.07.2020
P00487955 | Gucci | hats/hair accessories | accessoires | 05.07.2020
P00487956 | Gucci | hats/hair accessories | accessoires | 05.07.2020
P00487957 | Gucci | hats/hair accessories | accessoires | 05.07.2020
P00488001 | Gucci | shoulder bags | bags | 05.07.2020
P00488352 | Valentino | tops | clothing | 05.07.2020
P00488545 | Loewe | jackets | clothing | 05.07.2020
P00488559 | Tom Ford | pumps | shoes | 05.07.2020
P00488571 | Tom Ford | high sandals | shoes | 05.07.2020
P00488591 | Loewe | jackets | clothing | 05.07.2020
P00488670 | Loewe | tops | clothing | 05.07.2020
P00488753 | Valentino | pants | clothing | 05.07.2020
P00489313 | Givenchy | pumps | shoes | 05.07.2020
P00489827 | Saint Laurent | shorts | clothing | 05.07.2020
P00489853 | Saint Laurent | capes/ponchos | clothing | 05.07.2020
P00489868 | Saint Laurent | dresses | clothing | 05.07.2020
P00490811 | Stella McCartney | ankle boots | shoes | 05.07.2020
P00491849 | Fendi | dresses | clothing | 05.07.2020
P00496045 | Givenchy | dresses | clothing | 05.07.2020
P00496717 | Gucci | tops | clothing | 05.07.2020
P00496786 | Dolce & Gabbana | dresses | clothing | 05.07.2020
P00496787 | Dolce & Gabbana | dresses | clothing | 05.07.2020
P00497283 | Rick Owens | dresses | clothing | 05.07.2020
P00497853 | Balmain | swimwear | clothing | 05.07.2020
P00498363 | Gucci | sweatshirts | clothing | 05.07.2020
P00499108 | Costarellos | cocktail/gowns | clothing | 05.07.2020
P00499296 | Chloé | dresses | clothing | 05.07.2020
P00499632 | The Attico | dresses | clothing | 05.07.2020
P00499638 | The Attico | pants | clothing | 05.07.2020
P00499715 | Paco Rabanne | dresses | clothing | 05.07.2020
P00491734 | Fendi | jackets | clothing | 05.07.2020
P00499746 | Paco Rabanne | casual tops | clothing | 05.07.2020

 

 

let
    Source = Folder.Files("T:\Public\MERCHANDISING RETAIL\Pre-Shopping\Pre-Shopping Lists"),
    #"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from Pre-Shopping Lists", each #"Transform File from Pre-Shopping Lists"([Content])),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from Pre-Shopping Lists"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Pre-Shopping Lists", Table.ColumnNames(#"Transform File from Pre-Shopping Lists"(#"Sample File"))),
    #"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Week", each Date.WeekOfYear([Pre Shopping Date])),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Source.Name"}),
    #"Reordered Columns" = Table.ReorderColumns(#"Removed Columns",{"Week", "No.", "Brand Description", "Description", "Main Waregroup Description"}),
    #"Renamed Columns" = Table.RenameColumns(#"Reordered Columns",{{"No.", "No"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"Pre Shopping Date", type date}, {"Week", Int64.Type}, {"No", type text}, {"Brand Description", type text}, {"Description", type text}, {"Main Waregroup Description", type text}}),
    #"Added Custom1" = Table.AddColumn(#"Changed Type", "Qty", each List.Sum(Table.SelectRows(DOR, (r)=> (r[Week]=[Week] and r[No]=[No] ))[QTY])),
    #"Added Custom2" = Table.AddColumn(#"Added Custom1", "Revenue", each List.Sum(Table.SelectRows(DOR, (r)=> (r[Week]=[Week] and r[No]=[No] ))[Revenue Net]))
in
    #"Added Custom2"

 

 

Daily Order Report (1x per day), avg. 5.000 rows

Description | Style | Item Brand | Main Waregroup | Department Group | Onlineset | For Sale Date | Season | Is Core Item | Is Exclusive | Colour | QTY | sale price | Revenue Net | Qty Discount | Revenue Discount
P00394784 belts | Unassigned | Off-White | accessories | WOMENSWEAR | 2000_03_10 | 8/8/2019 | 20/1 | No | No | 6010 - YELLOW BLACK | 30 | 86 | 2.340 € | 30 | 2.340 €
P00435551 sneakers | Unassigned | Stella McCartney | shoes | WOMENSWEAR | 08 | 11/29/2019 | 20/1 | No | No | 9099 - WH/MULTI | 18 | 175 | 2.704 € | 18 | 2.704 €
P00440255 hats/hair accessories | Unassigned | Jacquemus | accessories | WOMENSWEAR | 2020_03_16 | 3/13/2020 | 20/1 | No | No | NATURAL - NATURAL | 13 | 227 | 2.019 € | 13 | 2.019 €
P00445233 shoulder bags | Unassigned | Loewe | bags | MENSWEAR | 2020_06_01 | 1/15/2020 | 20/1 | No | No | 2530 - BROWN | 12 | 175 | 1.507 € | 12 | 1.507 €
P00422869 belts | Unassigned | Off-White | accessories | WOMENSWEAR | 15 | 12/9/2019 | 20/1 | No | Yes | 0110 - WHITE BASE/BLACK WRITING | 11 | 107 | 792 € | 11 | 792 €
P00494290 espadrilles | Unassigned | Castañer | shoes | WOMENSWEAR | 2020_06_09 | 6/8/2020 | 20/2 | No | No | 100 - NEGRO | 10 | 105 | 888 € |  |  
P00394419 totes | Unassigned | Off-White | bags | WOMENSWEAR | 2000_03_10 | 5/29/2019 | 20/1 | No | No | 1001 - BLACK WHITE | 9 | 234 | 1.564 € | 9 | 1.564 €
P00452928 sneakers | Unassigned | Acne Studios | shoes | MENSWEAR | 2020_06_10 | 3/17/2020 | 20/1 | No | No | IVORY WHIT - IVORY WHITE | 9 | 156 | 902 € | 9 | 902 €
P00450812 shoulder bags | Unassigned | Jil Sander | bags | WOMENSWEAR | 2020_02_28 | 4/1/2020 | 20/1 | No | No | 680 - CANDY | 9 | 330 | 1.802 € | 9 | 1.802 €

 

 

let
    Source = Folder.Files("\\mytheresa.local\Files\Public\MERCHANDISING RETAIL\RECYCLING\KW\DOR"),
    #"Filtered Hidden Files1" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Hidden Files1", "Transform File from DOR", each #"Transform File from DOR"([Content])),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from DOR"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from DOR", Table.ColumnNames(#"Transform File from DOR"(#"Sample File (2)"))),
    #"Added Custom" = Table.AddColumn(#"Expanded Table Column1", "Date", each Text.Range([Source.Name],0,8)),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom",{{"Date", type date}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"Source.Name", "Style", "Item Brand", "Main Waregroup", "Department Group", "Onlineset", "For Sale Date", "Season", "Is Core Item", "Is Exclusive", "Colour", "sale price", "Qty Discount", "Revenue Discount"})
in
    #"Removed Columns"

 

 

Thank you!

 

Lorenzo

Pragati11
Super User

HI @LorenzoMolteni ,

 

Try creating a relationship between your 2 tables on Product No.

 

Thanks,

Pragati

Best Regards,

Pragati Jain




LinkedIn | Twitter | Blog YouTube 

Did I answer your question? Mark my post as a solution! This will help others on the forum!

Appreciate your Kudos!!

Proud to be a Super User!!

Hi @Pragati11 

 

That relationship is already created but doesn't help solve the above-mentioned issue with WeekNum and Date.

 

Best,


Lorenzo

edhans
Super User

I do not think you need a bridge table. You need a Date table, with at least 2 fields. (Should have a few dozen, but these two are critical for you.)

  1. The date. Obviously. You'll mark the table as a date table in Power BI or Power Pivot, then this field as the date field.
  2. A week/year column. So 202001, 202002, etc. through 202052 or 202053 however it works out. You can do this by using this formula:
Date.Year([Date]) * 100 + Date.WeekOfYear([Date])

Then, to the table that only has the week number, you need to add a matching Week/Year column. Ideally you'd use a date; lacking one, you can use this formula, though it is a bit of a hard-code:

2020 * 100 + [WEEK]

Then use the fields from your date table: relate Date[Date] to your sales date, and Date[Year Week] to the Year/Week num field in the other table.
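The week-key idea can be sanity-checked outside Power Query. This Python sketch uses ISO weeks via `isocalendar()` purely for illustration; note that `Date.WeekOfYear`'s default numbering is a different convention, so whichever convention you pick must be applied to both tables:

```python
# Sketch of the Year*100 + WeekNum bridge key, using Python's ISO weeks
# for illustration. Power Query's Date.WeekOfYear default numbering is a
# different convention -- the key only works if BOTH tables use the same one.
from datetime import date

def year_week_key(d: date) -> int:
    iso_year, iso_week, _ = d.isocalendar()
    return iso_year * 100 + iso_week

# A daily sale and a weekly product list dated in the same week get the
# same key, so a single "Year Week" column can bridge the two tables.
weekly_list_date = date(2020, 8, 10)  # Monday
daily_sale_date = date(2020, 8, 12)   # Wednesday of the same week

print(year_week_key(weekly_list_date))  # 202033
print(year_week_key(daily_sale_date))   # 202033
```

Multiplying the year by 100 also keeps week 1 of a new year sorting after week 52/53 of the previous one, which a bare week number would not.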

Here is a fairly comprehensive date table with this info, and below a link to show you how to make it dynamic.

let
    Source = {Number.From(#date(2020,1,1))..Number.From(#date(2020,12,31))},
    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), {"Date"}, null, ExtraValues.Error),
    #"Changed Type" = Table.TransformColumnTypes(#"Converted to Table",{{"Date", type date}}),
    #"Inserted Year" = Table.AddColumn(#"Changed Type", "Year", each Date.Year([Date]), Int64.Type),
    #"Inserted Month" = Table.AddColumn(#"Inserted Year", "Month", each Date.Month([Date]), Int64.Type),
    #"Inserted Month Name" = Table.AddColumn(#"Inserted Month", "Month Name", each Date.MonthName([Date]), type text),
    #"Added Short Month Name" = Table.AddColumn(#"Inserted Month Name", "Short Month Name", each Text.Start([Month Name],3), type text),
    #"Added Month Year" = Table.AddColumn(#"Added Short Month Name", "Month Year", each [Short Month Name] & " " & Text.From([Year]), type text),
    #"Added Month Year Sort" = Table.AddColumn(#"Added Month Year", "Month Year Sort", each [Year]*100 + [Month], Int64.Type),
    #"Inserted Quarter Number" = Table.AddColumn(#"Added Month Year Sort", "Quarter Number", each Date.QuarterOfYear([Date]), Int64.Type),
    #"Inserted Quarter" = Table.AddColumn( #"Inserted Quarter Number","Quarter", each "Qtr " & Text.From([Quarter Number]), type text),
    #"Inserted Quarter Year Sort" = Table.AddColumn(#"Inserted Quarter", "Quarter Year Sort", each [Year] * 10 + [Quarter Number], Int64.Type),
    #"Inserted Quarter Year" = Table.AddColumn(#"Inserted Quarter Year Sort", "Quarter Year", each "Q" & Text.End([Quarter],1) & " " & Text.From([Year]), type text),
    #"Inserted Week of Year" = Table.AddColumn(#"Inserted Quarter Year", "Week of Year", each Date.WeekOfYear([Date]), Int64.Type),
    #"Inserted Week of Month" = Table.AddColumn(#"Inserted Week of Year", "Week of Month", each Date.WeekOfMonth([Date]), Int64.Type),
    #"Inserted Day" = Table.AddColumn(#"Inserted Week of Month", "Day", each Date.Day([Date]), Int64.Type),
    #"Inserted Day of Week" = Table.AddColumn(#"Inserted Day", "Day of Week", each Date.DayOfWeek([Date]), Int64.Type),
    #"Inserted Day of Year" = Table.AddColumn(#"Inserted Day of Week", "Day of Year", each Date.DayOfYear([Date]), Int64.Type),
    #"Inserted Day Name" = Table.AddColumn(#"Inserted Day of Year", "Day Name", each Date.DayOfWeekName([Date]), type text),
    #"Added IsFuture Boolean" = Table.AddColumn(#"Inserted Day Name", "IsFuture", each [Date] > DateTime.Date(DateTime.LocalNow()), type logical),
    #"Added IsInCurrentWeek" = Table.AddColumn(#"Added IsFuture Boolean", "IsInCurrentWeek", each Date.IsInCurrentWeek([Date]), type logical),
    #"Added IsInCurrentMonth" = Table.AddColumn(#"Added IsInCurrentWeek", "IsInCurrentMonth", each Date.IsInCurrentMonth([Date]), type logical),
    #"Added IsInCurrentQuarter" = Table.AddColumn(#"Added IsInCurrentMonth", "IsInCurrentQuarter", each Date.IsInCurrentQuarter([Date]), type logical),
    #"Added IsInCurrentYear" = Table.AddColumn(#"Added IsInCurrentQuarter","IsInCurrentYear", each Date.IsInCurrentYear([Date]), type logical),
    #"Added Year Week" = Table.AddColumn(#"Added IsInCurrentYear", "Year Week", each Date.Year([Date]) * 100 + Date.WeekOfYear([Date]), Int64.Type)
in
    #"Added Year Week"

How to use M code provided in a blank query:
1) In Power Query, select New Source, then Blank Query
2) On the Home ribbon, select "Advanced Editor" button
3) Remove everything you see, then paste the M code I've given you in that box.
4) Press Done
5) See this article if you need help using this M code in your model.

 

Creating a Dynamic Date Table in Power Query

 



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting

Hi @edhans

 

Thanks for your help with this. Really much appreciated.

What I still can't understand is how I can relate Date[Year Week] to the new Year/Week num in my source table, since each column contains duplicate values and Power Pivot doesn't allow me to do that.

How can I overcome this?

 

Thank you!

 

Lorenzo

ACK! You are correct. That is many to many. You'll need to convert that week number to a date. This will do that:

 

List.Min(
    {
        #date(2020+1,1,1),
        Date.AddDays(
            #date(2020,1,1),
            -Date.DayOfWeek(
                #date(2020,1,1),
                Day.Monday
                )+[Week of Year] * 7
            )
    }
)

 

You'll need to play with it and make sure it conforms to your week. My example starts on a Monday. You can change that to Day.Sunday, Day.Wednesday, or whatever.

 

Then connect the Date[Date] field (unique) to this new pseudoweek day field in your sales table.

And I've hardcoded the 2020. You should put in a valid year field so it doesn't stop working in 4 months.
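The same arithmetic can be sketched with a hypothetical Python helper, assuming Monday-start weeks and the same "Monday on or before Jan 1" anchor as the M expression above, with the year passed in rather than hard-coded:

```python
# Sketch of turning a week number back into that week's Monday, mirroring
# the M List.Min expression above: anchor on the Monday on or before Jan 1,
# step forward week*7 days, and cap the result at Jan 1 of the next year.
from datetime import date, timedelta

def week_to_monday(year: int, week: int) -> date:
    jan1 = date(year, 1, 1)
    # Monday on or before Jan 1 (weekday(): Monday == 0)
    monday_before_jan1 = jan1 - timedelta(days=jan1.weekday())
    candidate = monday_before_jan1 + timedelta(days=7 * week)
    # Cap at next Jan 1, like List.Min does in the M code above.
    return min(candidate, date(year + 1, 1, 1))

print(week_to_monday(2020, 1))   # 2020-01-06
print(week_to_monday(2020, 33))  # 2020-08-17
print(week_to_monday(2020, 60))  # 2021-01-01 (capped)
```

Since each week maps to exactly one date, the resulting column is suitable as the many-side of a relationship to the unique Date[Date] column.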




Hi @edhans 

 

Thank you! Unfortunately another problem is now occurring: I'm no longer able to build the relationship between my two sources (product lists and sales numbers). It remains inactive due to the other relationships between Sales and Date.

Please see the screenshot. I don't know how to solve this...

 

(Take everything you see below as an attempt)

 
 

 


I'm not 100% clear on your tables. I know what is causing this inactive relationship though.

DIM tables are dimension tables and should describe your data. So a date table (describes dates), customer table (describes customers), product table (describes products.)

Your FACT tables are facts, or transactions, so sales, purchase, orders, etc.

FACT tables should not be related to each other, and DIM tables should not be related to each other.

 

You've related what looks to me like your DIM Product table to your DIM Date table. Why are you doing that? In other words, what date relationship do you care about between the dates in the date table and your product table?

 

I also don't know what your Item table and Product tables are. They sound like the same thing to me. Can they be merged into a single table? I know almost nothing about your data, but to me this should be 3 tables:
DIM Products
DIM Date
FACT Sales

The DIMs should be related 1 to many to the Sales table.

Post back with more info. You are getting deep into data modeling here, which is how you ultimately build a successful DAX model!




Hi @edhans 

 

Thanks for your reply!

First of all, no need for the Items table. That was just an attempt I made to solve the issue.

As you just said, there should be 3 tables in Power Pivot:

- Products list: This is a selection of products that gets uploaded and goes on sale. I get a new selection/list once a week (uploaded on Sunday). I get this from Power Query (import from folder). It contains the Product No., Brand, Pre-Shopping Date (date of upload), Week, and now the Fake Date you suggested creating. I want to study the sales on these products during their first week live only (i.e. if I filter for CW16 I only want to see sales performance for the products I uploaded in CW16 during their first week, not taking into account sales data from CW17 on).

- Sales: This is basically the daily order report for all products we sell. I also get this from Power Query (import from folder). It holds various data, but the ones we definitely need are Product No., QTY, Revenue, and Date (sale date).

- Date: This is simply the date table you suggested creating, with Date and Year/Week.

 

Given the above, I assume I have to create a connection between:

- Product List & Sales on Product No.

- Date & Sales on Date

- Product List & Date on Fake Date / Date

 

I may have misunderstood your earlier advice, but I'm not sure. How shall I overcome the issue?

Please find attached how the final report looks. The error occurs when filtering for more/different weeks.

 

Thanks so much!

 

Lorenzo

 

Final Report - error occurs when filtering by weeks

Why not create a true/false field in the product table that determines if this is the first week? Date.IsInCurrentWeek() will do it based on today (the refresh date), but you can tweak that or calculate your own week number there. I don't think there is a need to relate that to the date table specifically. Then just add a filter for this field in your measures.

 


New Sales = 
CALCULATE(
    SUM( Sales[Revenue] ),
    FILTER(
        Sales,
        RELATED( Product[IsCurrentWeek] ) = TRUE()
    )
)

I typed that in free hand. Didn't test it at all, but you get the idea of what I am considering here.

 



