jburbano
Frequent Visitor

Line graph comparing values per date, but dates missing from one table show only one value

For any DateTime that exists in both tables, the line graph shows both values, but for any DateTime where a row exists in one table and not the other, it only shows the one. Since these are running aggregations of totals, the missing total should still be the same as the value at the prior available DateTime.

 

[Screenshots: CICreated.png, CDCreated.png, CICDTotals.png, CICDTotalsGood.png]

 

10/17 only exists in one table, so only that line shows a value there, but the last valid CDTotals value should show instead. The DAX I am using for the measures is as follows:

 

CDTotals =
// Running distinct count of release definitions created on or before the latest date in the current filter context
CALCULATE (
    DISTINCTCOUNT ( CDCreatedOn[release.releaseDefinition.name] ),
    FILTER (
        ALLEXCEPT ( CDCreatedOn, CDCreatedOn[release.releaseDefinition.name] ),
        CDCreatedOn[CreatedOn] <= MAX ( CDCreatedOn[CreatedOn] )
    )
)
 
and
 
CITotals =
// Same pattern for builds: running distinct count of build definitions queued on or before the latest date in context
CALCULATE (
    DISTINCTCOUNT ( CICreatedOn[build.definition.name] ),
    FILTER (
        ALLEXCEPT ( CICreatedOn, CICreatedOn[build.definition.name] ),
        CICreatedOn[QueuedFilter] <= MAX ( CICreatedOn[QueuedFilter] )
    )
)
 
The values are correct, but I need the gaps filled in automatically. I'm hoping I don't have to do that in the table by adding dummy rows with empty strings, because then my DAX would need to account for the DISTINCTCOUNT of the single empty string, and the M expression would need to fill in the missing dates.
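
One pattern that might avoid the dummy rows is to put a separate date table on the visual's axis and anchor the running count to the plotted date, so every date is evaluated even when a fact table has no row for it. A rough sketch of the CD measure, assuming a disconnected date table (no relationship to the fact tables) whose column I'll call DateTable[Date]; the table and column names are placeholders:

CDTotalsFilled =
VAR PlottedDate =
    MAX ( DateTable[Date] ) // placeholder: whatever date table ends up on the axis
RETURN
    CALCULATE (
        DISTINCTCOUNT ( CDCreatedOn[release.releaseDefinition.name] ),
        FILTER (
            ALL ( CDCreatedOn[CreatedOn] ),
            // everything up to the end of the plotted day, so dates with no rows
            // still return the last known running total
            CDCreatedOn[CreatedOn] < PlottedDate + 1
        )
    )

The CI measure would follow the same shape against CICreatedOn[QueuedFilter].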
2 REPLIES

Hi @jburbano,

 

are you using a date table?

 

Regards,

Marcus

Dortmund - Germany


Yes, I am:

 

let
    // Stack the Releases and Builds date/time columns into one table
    Source = Table.Combine({
        Table.SelectColumns(Releases, {"release.createdOn"}),
        Table.SelectColumns(Builds, {"build.queueTime"})
    }),
    // Merge the two columns (each row has one value and one null) into a single DateTimes text column
    #"Merged Columns" = Table.CombineColumns(
        Table.TransformColumnTypes(Source, {{"release.createdOn", type text}, {"build.queueTime", type text}}, "en-US"),
        {"release.createdOn", "build.queueTime"},
        Combiner.CombineTextByDelimiter("", QuoteStyle.None),
        "DateTimes"
    ),
    #"Changed Type" = Table.TransformColumnTypes(#"Merged Columns", {{"DateTimes", type datetime}}),
    #"Extracted Date" = Table.TransformColumns(#"Changed Type", {{"DateTimes", DateTime.Date, type date}}),
    #"Removed Duplicates" = Table.Distinct(#"Extracted Date"),
    #"Sorted Rows" = Table.Sort(#"Removed Duplicates", {{"DateTimes", Order.Ascending}}),
    // Contiguous list of dates from the earliest to the latest date; + 1 so the last date is included
    #"DateTimesList" = List.Dates(
        Record.Field(Table.Min(#"Sorted Rows", "DateTimes"), "DateTimes"),
        Duration.Days(
            Record.Field(Table.Max(#"Sorted Rows", "DateTimes"), "DateTimes")
                - Record.Field(Table.Min(#"Sorted Rows", "DateTimes"), "DateTimes")
        ) + 1,
        #duration(1, 0, 0, 0)
    ),
    #"Converted to Table" = Table.FromList(#"DateTimesList", Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Changed Type1" = Table.TransformColumnTypes(#"Converted to Table", {{"Column1", type date}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type1", {{"Column1", "DateTimes"}})
in
    #"Renamed Columns"

 

This gives me a table of dates starting from the MIN date across Builds and Releases; the count is found by subtracting the MIN date from the MAX date (plus one so the last date is included), with the step set to one day.

 

[Screenshots: Min.png, Max.png]
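
For reference, a similar date table could be built as a DAX calculated table instead of the M query above. This is only a sketch that assumes CDCreatedOn[CreatedOn] and CICreatedOn[QueuedFilter] are the loaded datetime columns; note that CALENDAR names its output column Date and includes both end dates:

DateTimes =
VAR MinDT =
    MIN ( MIN ( CDCreatedOn[CreatedOn] ), MIN ( CICreatedOn[QueuedFilter] ) )
VAR MaxDT =
    MAX ( MAX ( CDCreatedOn[CreatedOn] ), MAX ( CICreatedOn[QueuedFilter] ) )
RETURN
    CALENDAR (
        DATE ( YEAR ( MinDT ), MONTH ( MinDT ), DAY ( MinDT ) ), // strip the time portion
        DATE ( YEAR ( MaxDT ), MONTH ( MaxDT ), DAY ( MaxDT ) )
    )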

Would it be a matter of combining tables by the Dates column?
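
If the DateTimes table is related to both fact tables on the date (for example through a date column derived from CreatedOn / QueuedFilter) and its DateTimes column is used on the axis, the measures could be re-anchored to the date table with the usual cumulative pattern. A sketch only; the relationships and column names are assumptions:

CDTotals =
CALCULATE (
    DISTINCTCOUNT ( CDCreatedOn[release.releaseDefinition.name] ),
    FILTER (
        ALL ( DateTimes[DateTimes] ),
        // filter the date table up to the plotted date; the relationship carries it
        // to CDCreatedOn, so every date on the axis gets the running distinct count
        DateTimes[DateTimes] <= MAX ( DateTimes[DateTimes] )
    )
)

CITotals would change the same way, and dates with no new rows would then show the previous cumulative value instead of dropping off the line.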
