vmakker
Frequent Visitor

Scheduled refresh fails for one table in Power BI (date comparison issue in Power Query)

Hi, 

I have not seen this issue ever before.

Here is the problem and it started on Thursday (12 Dec 2024).

 

I have a report in Power BI where I am getting some data from Fabric. I pull two tables, each with columns, say, Key and DateTime. I merge the two tables on the Key column, transform the DateTime columns to Date, and then filter for records where the DateTime columns (now dates) are identical. I go on to add more steps/columns to get what I need and save.

It works completely fine on Desktop.

Things got interesting (read frustrating) on Thursday evening. I published the report with no issue, but when I performed a manual refresh just to make sure things were working I started getting the error seen in the screenshots attached. I couldn't put my finger on it, so I went back to my desktop to see if I missed something, but nothing. So I started troubleshooting and started deleting each step from Power Query one by one and republishing the report (it said that I converted a NULL to Number, but I have no such step), until I stumbled upon the step where I was filtering on dates being identical. Once I removed that step, there was no issue when I published it. But since I needed the dates to match, I created a Custom Column where identical dates returned 1 and filtered on that. Once I published that, I saw that the refresh worked fine, but no data loaded in the semantic model.
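For illustration, a null-safe version of that date-equality filter can be written in Power Query M (the step and column names here are hypothetical, not from the actual report) so that no null ever reaches the comparison or a type conversion:

    #"Filtered Matching Dates" = Table.SelectRows(#"Merged Queries", each
        // skip rows where either side is null, then compare only the date parts
        [DateTime1] <> null and [DateTime2] <> null
        and Date.From([DateTime1]) = Date.From([DateTime2]))

This keeps the null handling inside the filter step itself instead of in a separate custom column.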

 

 

Now I have done these kinds of transformations/steps/publishing on web with the same sources for quite a while, so I am stumped. I have another visual in the same report from Fabric that doesn't use date comparisons and it works fine.

Sincerely hoping for some assistance. 

 

Thank you!

 

[screenshot attachments: vmakker_0-1734225841257.png, vmakker_1-1734225875101.png]

 

1 ACCEPTED SOLUTION

Hi @vmakker,

Please consider raising a support ticket for further assistance.

To raise a support ticket for Fabric and Power BI, kindly follow the steps outlined in the following guide:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

Thank you.


21 REPLIES
Arun_kumar
Frequent Visitor

Replace nulls with zero before converting the data type.

Yeah, I had tried that too. It didn't work. Anyway, I had to perform all the cleanup in SQL, so that took care of it.

Hi @vmakker ,

Thank you for the response. If you still require any assistance, please feel free to reach out.

Hi @vmakker ,

I just wanted to kindly follow up to see if you had a chance to review the previous response provided by me. Please let me know if it was helpful. If yes, please Accept the answer so that it will be helpful to others to find it quickly.

Thank you.

Hi @vmakker ,

I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.

Thank you.

Hi,

I can't accept it as a solution since it didn't solve the problem; I had to clean/preprocess the data in SQL. Since my question was about the issue in Power Query, I still don't know what caused it.


Hi @vmakker ,

We are following up once again regarding your query. Could you please confirm if the issue has been resolved through the support ticket with Microsoft?

If the issue has been resolved, we kindly request you to share the resolution or key insights here to help others in the community. If we don’t hear back, we’ll go ahead and close this thread.

Should you need further assistance in the future, we encourage you to reach out via the Microsoft Fabric Community Forum and create a new thread. We’ll be happy to help.

 

Thank you for your understanding and participation.

v-menakakota
Community Support

Hi @vmakker ,

As you mentioned, you replaced the NULL values but are still facing the same issue, so try the following process. After each step, refresh in the Service. I hope this works for you:

  • Import the tables into Power Query without any transformations. Publish and test the refresh in the Power BI Service.
  • Add one transformation (e.g., merge the two tables on the Key column). Apply changes, publish, and refresh in the Service. Verify success after each step, such as transforming the DateTime column to Date.
  • Continue adding transformations one at a time. Test the refresh after each step to isolate any failure.
  • If the refresh fails, focus on the last step added. Troubleshoot for data type issues, query folding, or NULL values.

    If this post was helpful, please give us Kudos and consider marking it as the Accepted Solution to help other members find it more easily.

 

Hi @vmakker ,

I just wanted to kindly follow up to see if you had a chance to review the previous response provided by me. Please let me know if it was helpful. If yes, please Accept the answer so that it will be helpful to others to find it quickly.

Thank you.

Hi, 

It didn't work. I ended up cleaning the data at SQL level rather than handling in Power Query.

v-menakakota
Community Support

Hi @vmakker ,
Thank you for reaching out to us on the Microsoft Fabric Community Forum.

Since you mentioned an error related to converting NULL to a Number, explicitly handle any NULL values in your DateTime columns before performing transformations. You can do this by using the "Replace Values" function to replace NULLs with a default date, or by using a conditional column to handle NULLs.
Ensure all nulls in critical columns (case_create_local_time_raw, EarliestResolution) are handled explicitly to avoid mismatches.
Check the custom column logic to ensure it correctly identifies matching dates, as small differences in formatting or time zones might cause issues in the Service that don't appear in Desktop.
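As a sketch of that "Replace Values" approach (the sentinel date is an assumption, not from the report), the nulls can be replaced before the type change:

    // replace nulls with an arbitrary sentinel date before changing the column type
    #"Replaced Nulls" = Table.ReplaceValue(Source, null, #datetime(1900, 1, 1, 0, 0, 0),
        Replacer.ReplaceValue, {"case_create_local_time_raw", "EarliestResolution"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Replaced Nulls",
        {{"case_create_local_time_raw", type datetime}, {"EarliestResolution", type datetime}})

Pick a sentinel value that cannot collide with real data, or filter those rows out afterwards.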

If you still face the same issue after following the steps above, then try these alternatives:

  1. Incremental Refresh: If you are dealing with large datasets, consider using incremental refresh to optimize the data refresh process. This can help reduce the chances of encountering errors during full dataset refreshes.
  2. If possible, use Power BI Dataflows to handle complex transformations and then connect your report to the Dataflows. This can offer a more stable environment for processing complex data tasks.

    If this post was helpful, please give us Kudos and consider marking it as the Accepted Solution to help other members find it more easily.
    Thank you.




Since you mentioned an error related to converting NULL to a Number, explicitly handle any NULL values in your DateTime columns before performing transformations. You can do this by using the "Replace Values" function to replace NULLs with a default date or by using a conditional column to handle NULLs.
Ensure all nulls in critical columns (case_create_local_time_raw, EarliestResolution) are handled explicitly to avoid mismatches




Yeah, I already did that. Still the same issue. I found another peculiarity as I was troubleshooting. When I look at the semantic model on the web, and look at the tables that are giving me issues, the date columns show up as blank (which is impossible because the source tables do not have any blank values). 

lbendlin
Super User

where I am getting some data from Fabric

How?  Direct Lake?  SQL Endpoint?

 

I merge both the tables on Key column

Any particular reason why you don't let the data model do the work?

 

Can you guarantee that your date(...) columns are always filled?  Consider using the Power Query equivalent of COALESCE()   ( ie, ?? )
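For example (the sentinel value here is only an illustration), the ?? operator returns its left operand unless that operand is null:

    // coalesce a possibly-null datetime before converting it to a date
    SafeCreateDate = Date.From([case_create_local_time_raw] ?? #datetime(1900, 1, 1, 0, 0, 0))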

SQL endpoint. I've never really considered the possibility of the data model doing the work (I also don't know exactly what that is, to be quite honest).

And yes, my SQL query in Power Query filters for records where the datetime column is not null.

Can you show a sanitized version of the query?

Yes of course:

  1. First table - Stores the Case Number (key) and current resolution datetime (datetime column)
    • SELECT caseNumber, currentResolution FROM Table1 WHERE currentResolution IS NOT NULL
  2. Second table - Stores the Case Number (key) and all resolution datetimes for a certain case in a JSON column (since a case can have multiple resolution datetimes)
    • SELECT caseNumber, JSONcolumn FROM Table2

Then I transform the JSONcolumn to get to all the resolution times for a certain case, group by case to find the earliest resolution datetime, and merge this on caseNumber with the first table. Then I filter for rows where earliest resolution datetime = currentResolution.

As I write this out, I think as a workaround I can perform an inner join where caseNumber from table 1 = caseNumber from table 2 AND earliest resolution datetime = currentResolution. I'll try to make that work and see where we end up.
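That workaround can also stay in Power Query M as a composite-key inner join via Table.NestedJoin (table and column names assumed), matching on both the key and the resolution datetime at once:

    // inner join on caseNumber AND the resolution datetime; non-matching rows drop out,
    // so a separate date-equality filter step is no longer needed
    #"Merged Tables" = Table.NestedJoin(
        Table1, {"caseNumber", "currentResolution"},
        Table2, {"caseNumber", "EarliestResolution"},
        "Matches", JoinKind.Inner)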
 

Sorry, I meant the Power Query.

Here it is:
Table 1

let
    Source = Sql.Database("QUERY FROM DATA SOURCE - TABLE 1"),
    #"Added Custom2" = Table.AddColumn(Source, "case_create_local_time", each //Do this to adjust the case create local date if received after business hours
if Time.From([case_create_local_time_raw])>=#time(16,30,0) then #datetime(
    Date.Year(Date.AddDays([case_create_local_time_raw], 1)), 
            Date.Month(Date.AddDays([case_create_local_time_raw], 1)), 
            Date.Day(Date.AddDays([case_create_local_time_raw], 1)), 
            0, 0, 0
) else [case_create_local_time_raw]),
    #"Merged Queries1" = Table.NestedJoin(#"Added Custom2", {"incidentid", "case_resolution_local"}, ResolutionCaseHistory, {"incidentid", "EarliestResolution"}, "ResolutionCaseHistory", JoinKind.Inner),
    #"Removed Columns1" = Table.RemoveColumns(#"Merged Queries1",{"case_create_local_time_raw"}),
    #"Expanded ResolutionCaseHistory" = Table.ExpandTableColumn(#"Removed Columns1", "ResolutionCaseHistory", {"EarliestResolution"}, {"EarliestResolution"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded ResolutionCaseHistory",{{"case_create_local_time", type date}, {"EarliestResolution", type date}, {"case_resolution_local", type date}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Duration", each Duration.From([EarliestResolution]-[case_create_local_time])),
    #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom",{{"Duration", type duration}}),
    #"Invoked Custom Function" = Table.AddColumn(#"Changed Type1", "BusinessDays", each BusinessDaysFunc([case_create_local_time], [EarliestResolution], HolidayTable[Date])),
    #"Changed Type2" = Table.TransformColumnTypes(#"Invoked Custom Function",{{"BusinessDays", Int64.Type}})
in
    #"Changed Type2"

 Merging is happening at "Merged Queries1"

Here is Table 2:

 

let
    Source = Sql.Database("QUERY FROM DATA SOURCE - TABLE 2"),
    #"Parsed JSON" = Table.TransformColumns(Source,{{"changedata", Json.Document}}),
    #"Expanded changedata" = Table.ExpandRecordColumn(#"Parsed JSON", "changedata", {"changedAttributes"}, {"changedAttributes"}),
    #"Expanded changedAttributes" = Table.ExpandListColumn(#"Expanded changedata", "changedAttributes"),
    #"Expanded changedAttributes1" = Table.ExpandRecordColumn(#"Expanded changedAttributes", "changedAttributes", {"logicalName", "oldValue", "newValue"}, {"logicalName", "oldValue", "newValue"}),
    #"Filtered Rows" = Table.SelectRows(#"Expanded changedAttributes1", each ([logicalName] = "adx_resolutiondate")),
    #"Changed Type" = Table.TransformColumnTypes(#"Filtered Rows",{{"newValue", type datetime}}),
    #"Filtered Rows1" = Table.SelectRows(#"Changed Type", each [newValue] <> null),
    #"Changed Type4" = Table.TransformColumnTypes(#"Filtered Rows1",{{"newValue", type datetimezone}}),
    #"Grouped Rows" = Table.Group(#"Changed Type4", {"incidentid"}, {{"EarliestResolution", each List.Min([newValue]), type nullable datetimezone}}),
    #"Changed Type5" = Table.TransformColumnTypes(#"Grouped Rows",{{"EarliestResolution", type datetimezone}}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Changed Type5",{{"EarliestResolution", type text}}),//Doing the following to get time in current zone
    #"Added Custom" = Table.AddColumn(#"Changed Type1", "DateTimePart", each Text.BeforeDelimiter([EarliestResolution], " -")),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "OffsetPart", each Text.AfterDelimiter([EarliestResolution]," -")),
    #"Split Column by Delimiter1" = Table.SplitColumn(#"Added Custom1", "OffsetPart", Splitter.SplitTextByDelimiter(":", QuoteStyle.Csv), {"OffsetPart.1", "OffsetPart.2"}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Split Column by Delimiter1",{{"DateTimePart", type datetime}, {"OffsetPart.1", Int64.Type}}),
    #"Added Custom2" = Table.AddColumn(#"Changed Type2", "OffsetAsHours", each #duration(0,-[OffsetPart.1],0,0)),
    #"Added Custom3" = Table.AddColumn(#"Added Custom2", "EarliestResolution_Local", each [DateTimePart]+[OffsetAsHours]),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom3",{"EarliestResolution", "DateTimePart", "OffsetPart.1", "OffsetPart.2", "OffsetAsHours"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"EarliestResolution_Local", "EarliestResolution"}}),
    #"Changed Type3" = Table.TransformColumnTypes(#"Renamed Columns",{{"EarliestResolution", type datetime}})
in
    #"Changed Type3"

 

You may want to use #duration in your first query.

  #"Added Custom2" = Table.AddColumn(Source, "case_create_local_time", each //Do this to adjust the case create local date if received after business hours
if Time.From([case_create_local_time_raw])>=#time(16,30,0) 
 then DateTime.From(Date.From([case_create_local_time_raw]+#duration(1,0,0,0)))
 else [case_create_local_time_raw]),

 

What should happen if [case_create_local_time_raw] is null ?
