
PBOBOP
Frequent Visitor

Creating a Line Chart to Reflect the Number of Items in a Location by Month

Hello,

 

I would like to create a line chart that reflects how many items are at a certain location in a particular month.

The x-axis will be month, and the y-axis is the number of items. 

The table below is an example of data I have for a particular item.

Item  | Date                        | From          | To
------|-----------------------------|---------------|--------------
XXXX1 | Wednesday, 20 February 2019 | Home          | Part Time Job
XXXX1 | Friday, 12 April 2019       | Part Time Job | Jobless
XXXX1 | Wednesday, 15 May 2019      | Jobless       | University
XXXX1 | Friday, 20 December 2019    | University    | Part Time Job
XXXX1 | Thursday, 8 October 2020    | Part Time Job | High School
XXXX1 | Thursday, 7 September 2023  | High School   | Part Time Job
XXXX1 | Monday, 11 September 2023   | Part Time Job | College Prep
XXXX1 | Friday, 27 October 2023     | College Prep  | University
XXXX1 | Thursday, 22 February 2024  | University    | Work

 

You can see the item was at University from 27 October 2023 to 22 February 2024. 

At the moment, the line graph only shows how many items entered a location in a given month, so item XXXX1 appears to be at University only in October 2023, because that is when it moved there.

However, I want it to also appear at University for November 2023, December 2023, January 2024, and February 2024, because it was at University during those months. In the line graphs below, I have filtered "To" to "University": 10 items entered University on 27 October 2023, and another item entered on 12 December 2023. There should therefore be 11 items altogether in December, because the 10 items that entered University in October are still there (no items entered in November).

 

Is this possible?

PBOBOP_1-1709858725012.png

PBOBOP_2-1709858744443.png

 

 

1 ACCEPTED SOLUTION
BA_Pete
Super User

Hi @PBOBOP ,

 

While @Jonvoge 's solution will make getting daily counts very easy, it will also make your data VERY big VERY quickly.

I would recommend making some small transformations to get the 'To' date on each row like this:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZFfC4IwFMW/ysVnH9wqrMcoIgJJyCiIHqZecjCdXFfgt28mlvZvT4Od37nn3J1OztEe5rjOAdMCq1TULnAPVhjTVVBt72xmX9c6R+fsvuQrkg8t4zAvSapOGAoyEMkcYaPjAdEbwCYQiKe3FSqsqq/2NsoSE8xjpE6+L+QNqZKmHhBRdqXWfQrbxOiW4N7fTC/Ihx2WphvER01neclgl2RaqwEU6KJtwT6Z36Oejfx+vIZZaKXwghASlj/Scd7/ET5+W8P5Dg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Item = _t, Date = _t, From = _t]),
    chgTypes = Table.TransformColumnTypes(Source,{{"Date", type date}}),
    sortItemDate = Table.Sort(chgTypes,{{"Item", Order.Ascending}, {"Date", Order.Ascending}}),
    addIndex1 = Table.AddIndexColumn(sortItemDate, "Index1", 1, 1, Int64.Type),
    addIndex0 = Table.AddIndexColumn(addIndex1, "Index0", 0, 1, Int64.Type),
    mergeItemIndex = Table.NestedJoin(addIndex0, {"Item", "Index1"}, addIndex0, {"Item", "Index0"}, "addIndex0", JoinKind.LeftOuter),
    expandDateTo = Table.ExpandTableColumn(mergeItemIndex, "addIndex0", {"Date"}, {"DateTo"}),
    remOthCols = Table.SelectColumns(expandDateTo,{"Item", "From", "Date", "DateTo"})
in
    remOthCols

BA_Pete_0-1709884966915.png

 

Then use a measure to get the count of items on any given day, like this:

_itemsInLocationOverTime = 
VAR __cDate = MAX(dimCalendar[Date])
RETURN
CALCULATE(
    DISTINCTCOUNT(yourTable[Item]),
    FILTER(
        yourTable,
        __cDate >= yourTable[Date]
        && ( __cDate <= yourTable[DateTo] || ISBLANK(yourTable[DateTo]) )
    )
)

 

This assumes you have an UNRELATED calendar table (dimCalendar), and you must use a dimCalendar field on the visual axis along with this measure.
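If you don't already have a calendar table, a minimal dimCalendar can be built in Power Query as a new query. This is only a sketch: the date range here is an assumption chosen to cover the sample data, and in practice you would derive it from your own data or widen it as needed.

```
let
    // Assumed date range covering the sample data; adjust to your data
    StartDate = #date(2019, 1, 1),
    EndDate   = #date(2024, 12, 31),
    // One list entry per day, inclusive of both endpoints
    Dates     = List.Dates(StartDate, Duration.Days(EndDate - StartDate) + 1, #duration(1, 0, 0, 0)),
    AsTable   = Table.FromList(Dates, Splitter.SplitByNothing(), {"Date"}, null, ExtraValues.Error),
    Typed     = Table.TransformColumnTypes(AsTable, {{"Date", type date}})
in
    Typed
```

Remember to leave this table unrelated to the fact table in the model, as described above, so that the FILTER inside the measure does the date comparison rather than a relationship.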

 

Pete



Now accepting Kudos! If my post helped you, why not give it a thumbs-up?

Proud to be a Datanaut!





5 REPLIES
Hi @PBOBOP, regarding @BA_Pete 's query: I don't know how many rows you have in your database, but this version should be even faster if you have a lot of rows.

 

 

let
    FnShift = (tbl as table, col as text, shift as nullable number, optional newColName as text) as table =>
    // In the 3rd parameter, pass a negative number to shift rows up, a positive number to shift them down; the 4th parameter is optional (new column name)
    let
        a = Table.Column(tbl, col),
        b = if shift = 0 or shift = null then a else if shift > 0
            then List.Repeat({null}, shift) & List.RemoveLastN(a, shift)
            else List.RemoveFirstN(a, shift * -1) & List.Repeat({null}, shift * -1),    
        c = Table.FromColumns(Table.ToColumns(tbl) & {b}, Table.ColumnNames(tbl) &
              ( if newColName <> null then {newColName} else
                if shift = 0 then {col & "_Duplicate"} else
                if shift > 0 then {col & "_NextValue"} 
                else              {col & "_PrevValue"} ))
    in
        c,
    
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZFfC4IwFMW/ysVnH9wqrMcoIgJJyCiIHqZecjCdXFfgt28mlvZvT4Od37nn3J1OztEe5rjOAdMCq1TULnAPVhjTVVBt72xmX9c6R+fsvuQrkg8t4zAvSapOGAoyEMkcYaPjAdEbwCYQiKe3FSqsqq/2NsoSE8xjpE6+L+QNqZKmHhBRdqXWfQrbxOiW4N7fTC/Ihx2WphvER01neclgl2RaqwEU6KJtwT6Z36Oejfx+vIZZaKXwghASlj/Scd7/ET5+W8P5Dg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Item = _t, Date = _t, From = _t]),
    ChangedType1 = Table.TransformColumnTypes(Source,{{"Date", type date}}),
    GroupedRowsWithAddDateTo = Table.Group(ChangedType1, {"Item"}, {{"All", each FnShift(_, "Date", -1, "Date To") , type table}}),
    Combined = Table.Combine(GroupedRowsWithAddDateTo[All]),
    ChangedType2 = Table.TransformColumnTypes(Combined,{{"Date To", type date}})
in
    ChangedType2

 


Note: Check this link to learn how to use my query.
Check this link if you don't know how to provide sample data.

Hi @BA_Pete,

 

Thank you for your suggestion.

I am stuck at the mergeItemIndex step. When I merge, all the numbers in Index1 are there. However, after the "expandDateTo" step, the last line disappears because there is no matching number in Index0, instead of returning "null" like in your table above, and items 109789 and 109790 disappear completely because they only have one line in Index1.

 

mergeItemIndex: 

PBOBOP_1-1710197733077.pngPBOBOP_2-1710198073833.png

expandDateTo:

PBOBOP_3-1710198153391.pngPBOBOP_4-1710198231556.png

 

Do you know how I can get "null" to return for the lines with no match for Index1? 

 

Have you checked that these rows haven't just been re-sorted to the end of the table or something?

There's no reason why a Left Outer merge would remove any rows - that's the point of a Left Outer - but merges in PQ do very often change the sort order of your table, so maybe all the rows with null [DateTo] values have been dumped to the bottom of your table. Try filtering the post-merge query on [DateTo] = null to see if the rows exist.
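As a concrete way to run that check, a temporary step could be added after the expandDateTo step of the earlier query (the step names here are taken from that query; "checkNullDateTo" is an assumed name for the new step):

```
checkNullDateTo = Table.SelectRows(expandDateTo, each [DateTo] = null)
```

If this step returns the rows for items 109789 and 109790, they were only re-sorted rather than removed by the merge.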

 

If these rows do indeed appear to be missing, then you may need to send over some copyable sample data so I can try to reproduce the issue.

 

Pete







Jonvoge
Super User

Hi PBOBOP

 

You need to transform your data into a proper fact table with Power Query / Transform Data, so that you have one row per item per day, detailing the current location of the item.

 

Try using the M-Code below:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZJRa8IwEMe/SshzHkxU3B7HpDhBJtjhoPQhrYcNS3vlGgf99ouWbLVrDSG54/K7+9+RJOFchJ2KhL85KL291g78FRFevRhvsU+/pHePcKqgOelWMDVjEWR00dR6Wz776AbLK7rX5FhsSmBbzO7wiMyNlYq91GRsAO8Jwf1poWmmSssl2+nfquGx4B+V+QZqjGtHq3rFa8ihzIAC2yMe6Y6LC3Wln9h77rDLoGYj0jfmXLBDXiDaiRQrdoDaBRlqPoAe6dhh1Q1A/s8xFPKK1sIZ2J6gHh/Hqt/LfIhMjvOvE6X6X0AthgM9In3xNP0B", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"", type text}, {"_1", type text}, {"_2", type text}, {"_3", type text}}),
    #"Promoted Headers1" = Table.PromoteHeaders(#"Changed Type1", [PromoteAllScalars=true]),
    #"Changed Type2" = Table.TransformColumnTypes(#"Promoted Headers1",{{"Item", type text}, {"Date", type date}, {"From", type text}, {"To", type text}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type2",{{"Date", "Start Date of Current Location"}, {"From", "Previous Location"}, {"To", "Current Location"}}),
    #"Added Index" = Table.AddIndexColumn(#"Renamed Columns", "Index", 1, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "End Date of Current Location", each try #"Added Index" [Start Date of Current Location] {[Index]} otherwise null, Date.Type),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom", each try List.Dates([Start Date of Current Location], Duration.Days([End Date of Current Location] -[Start Date of Current Location]) +1 ,#duration(1,0,0,0) ) otherwise null),
    #"Expanded Custom" = Table.ExpandListColumn(#"Added Custom1", "Custom"),
    #"Changed Type3" = Table.TransformColumnTypes(#"Expanded Custom",{{"Custom", type date}}),
    #"Renamed Columns1" = Table.RenameColumns(#"Changed Type3",{{"Custom", "Date"}})
in
    #"Renamed Columns1"

 

First, I create a new column with an index starting from 1.

Then I add a new column that takes the Date value from the next row, so that the start and end date of the "To" location sit on one row.

Then I create a list of dates between the start date and end date in a new column, and expand that list to rows to get one row per day:

 

Jonvoge_0-1709884549029.png

 

You can use this table to then count the number of items in each location on each day.
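If you prefer to do that counting in Power Query as well, rather than with a DAX measure, one possible sketch is to group the expanded table by day and location. The step name #"Renamed Columns1" is taken from the query above; "countPerDayAndLocation" and "ItemCount" are assumed names.

```
countPerDayAndLocation =
    Table.Group(
        #"Renamed Columns1",
        {"Date", "Current Location"},
        // Count distinct items per day per location
        {{"ItemCount", each Table.RowCount(Table.Distinct(_, {"Item"})), Int64.Type}}
    )
```

Note that this bakes the counts into the table at refresh time; a measure keeps the count dynamic with respect to slicers and filters.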

_____________________________________________________
I hope my comment was helpful.
If your question was answered, please mark your post as 'Solved' and consider giving me a 'Thumbs Up'.
Find me on LinkedIn, Sessionize, or my blog Downhill Data
