
db042190
Post Prodigy

Totals don't match

Hi, we have a production Power BI report in the service whose FYTD total across 13 facilities is $xxx,xxx,065.

When I refresh its semantic model in the service, I see the same.

When I download its semantic model, I see the same.

When I refresh the downloaded model on the desktop, I see $xxx,xxx,122.

In my development clone of this report, I see $xxx,xxx,122 on both the desktop and the service.

When I refresh the clone's semantic model, the total remains $xxx,xxx,122.

When I simulate all of this in T-SQL, the total comes out to $xxx,xxx,052.

When I full outer join my T-SQL simulation in SSMS against how I believe the Power BI semantic model runs, I get a perfect match.
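That full-outer-join reconciliation pattern can be sketched in plain Python (a stand-in for the T-SQL; the invoice keys and amounts below are hypothetical): compare per-key totals from the two sides and surface any key that is missing on one side or differs beyond a penny.

```python
# Reconcile two per-invoice totals, full-outer-join style: any key missing
# on one side, or differing beyond a tolerance, is flagged.
def reconcile(a, b, tolerance=0.005):
    diffs = {}
    for key in a.keys() | b.keys():  # union of keys = full outer join
        va, vb = a.get(key), b.get(key)
        if va is None or vb is None or abs(va - vb) > tolerance:
            diffs[key] = (va, vb)
    return diffs

# Hypothetical totals: SQL side vs. what the Power BI model appears to produce.
sql_totals = {"INV-1": 100.00, "INV-2": 50.00, "INV-3": 75.00}
pbi_totals = {"INV-1": 100.00, "INV-2": 50.00, "INV-4": 20.00}
print(reconcile(sql_totals, pbi_totals))  # flags INV-3 (dropped) and INV-4 (extra)
```

A nonempty result points at a dropped or duplicated record rather than rounding.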

The dollar amounts are in a float column. I experimented with converting them to decimal with about 8 or 9 decimal places to see if that would shed some light, but it didn't.

Does the community have any ideas? I looked for duplicates on invoice numbers and the like in a simulated query in SSMS, but nothing came up.

The tables involved in the Power BI model are Dates, Revenue, Customer, and an FYTD date table. I'll look tomorrow at their joins to see whether there is any way a duplicate can be introduced or a record dropped.

I should have mentioned that the semantic model's SQL relational sources don't change after 4 AM, when our ETL (which starts at 3 AM) finishes. Sorry about that. I'm going to run some comparisons today, and maybe poke around Power BI for a record count, even though not all the records will fit into the data view. As I recall, there is a feature somewhere in the desktop that offers some free profiling info; it's been a while. If the record count in Power BI differs from the SSMS simulation, I'll know I'm not wasting time going down this investigation path.

1 ACCEPTED SOLUTION

I removed the FYTD table from the model. I converted floats to decimal(18,6). I moved some calculation responsibility out of DAX and into the SQL views. I deleted some applied sort, rename, and (possibly) remove-duplicates steps. I removed one critical flag that was probably working, but that now can't even reach Power BI because a SQL predicate blocks it.

All numbers now match, not only against our warehouse dollars but between the desktop and the service as well. Rounding errors are hard to explain to the user when the main focus was on making the report smarter on the 1st of the month. The report always tried to report through "yesterday"; previous authors were so fixated on the Today() function that this created some challenges.

View solution in original post

14 REPLIES
v-karpurapud
Community Support

Hi @db042190 

I wanted to check whether you have any further questions. Has your issue been resolved? If not, please share more details so we can assist you further.

Thank You.

v-karpurapud
Community Support

Hi @db042190 

We have not yet received a response regarding your query. Could you please confirm if your issue has been resolved? If not, kindly provide additional details so we can offer further assistance.

Thank you.

db042190
Post Prodigy

I haven't tried the float-to-decimal conversion recommendation yet, but I did scour everything for possible join issues, using SSMS as a gauge. I don't see anything wrong, but I need to understand cross filtering a bit before I can rule it out.

An old SSRS report nailed the totals almost to the penny compared with my simulated T-SQL (what I think Power BI should be doing) over the same underlying float values. And the simulated T-SQL reconciled to the penny against simple sums over the core fact table, with no possible interference from joins.

 

Can the community tell me whether the following model looks like one of those situations where something passing through from one table affects results on the other side, or overall? I forget what it's called and admittedly never read a good explanation; I think it has to do with cross filtering. The cross-filter direction on each relationship is single. The Dates table is marked as a date table. I can probably get rid of the FYTD date table by blending it into the other date table (I inherited this whole thing). And I may transfer some responsibility away from DAX and into T-SQL to get this under control.

The aggregations here summarize to maybe 24.5K records (originating from 2-3 million), so I find it difficult to believe that rounding will explain this, but I'll convert to decimal anyway to level the playing field.

(attached screenshot of the model: readytogiveup.png)

I removed the FYTD table from the model. I converted floats to decimal(18,6). I moved some calculation responsibility out of DAX and into the SQL views. I deleted some applied sort, rename, and (possibly) remove-duplicates steps. I removed one critical flag that was probably working, but that now can't even reach Power BI because a SQL predicate blocks it.

All numbers now match, not only against our warehouse dollars but between the desktop and the service as well. Rounding errors are hard to explain to the user when the main focus was on making the report smarter on the 1st of the month. The report always tried to report through "yesterday"; previous authors were so fixated on the Today() function that this created some challenges.

Hi @db042190 

Thank you for the detailed updates and for sharing your findings. Could you please confirm whether the original issue with the FYTD totals has now been fully resolved?

Regards,
Karpurapu D.

db042190
Post Prodigy

Hi all, I reset the filters in the service for prod, and today's FYTD of yyy,yyy,700 didn't change.

I also dropped down the bookmarks in the service and see neither personal bookmarks nor "more" bookmarks.

I downloaded the prod semantic model, and FYTD shows as yyy,yyy,700.

I refreshed the downloaded prod PBIX, and FYTD shows as yyy,yyy,753.

Today's prod semantic model in the service refreshed at 7:36 AM. It's 10:13 AM right now.

I refreshed the semantic model in the service, then jumped right to the report, and still see yyy,yyy,700.

I hit the refresh button in the service for prod (left of the comment icon), and FYTD shows yyy,yyy,699, which to me is essentially the same as yyy,yyy,700.

I hit refresh in the browser in the service for prod, and FYTD remains yyy,yyy,699.

In SQL, I see the right value should be yyy,yyy,683.

In the clone in the service, I see yyy,yyy,698.

In the clone on the desktop, after hitting refresh, I see yyy,yyy,753.

In an old SSRS report, I see yyy,yyy,683.

While both prod and the clone seem to have an issue, possibly the same one, I'll focus on getting the clone straightened out. It does worry me that neither's values stay the same after a refresh on the desktop.

I saw an Insights button in the service, and as I recall there is probably one on the desktop too. Hopefully it will give me a record count, even though the number will be north of 1.6 million records.

 

 

 

Hi @db042190 

To address the inconsistent FYTD totals observed between Power BI Service, Desktop, and SQL, it is necessary to review the use of the float data type for currency values. Since floats are approximate, they can introduce rounding errors that accumulate during aggregation, especially in large datasets. This can cause totals to vary slightly with each refresh, even if the underlying data remains unchanged. Converting the float column to a fixed-precision type like decimal(18,6) at the SQL source or within Power BI Power Query will help ensure accurate and consistent calculations across all platforms.
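The order-sensitivity described above is easy to demonstrate. Here is a minimal Python sketch (standing in for the storage engine's float arithmetic) showing that float addition is not associative, while fixed-precision decimals are exact and order-independent:

```python
from decimal import Decimal

# Float addition is not associative: the same values summed in a different
# order (e.g. a different scan order after a refresh) can give a different total.
vals = [1e16, 1.0, 1.0]
print(sum(vals))             # the two 1.0s are absorbed and lost
print(sum(reversed(vals)))   # summing the small values first preserves them
print(sum(vals) == sum(reversed(vals)))  # False

# Classic penny-level drift: ten dimes don't sum to a dollar in binary float.
print(sum([0.1] * 10) == 1.0)  # False

# Fixed-precision decimals (like SQL decimal(18,6)) are exact either way.
dvals = [Decimal("1e16"), Decimal(1), Decimal(1)]
print(sum(dvals) == sum(reversed(dvals)))  # True
```

This is why converting the currency column to decimal at the source stabilizes totals across refreshes and engines.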

Additionally, please review any Power Query steps that remove duplicates. If these are not based on stable business keys, they may result in inconsistent row counts depending on refresh timing or data order. Refining this logic will help prevent unintended data loss or duplication. Comparing record counts and analyzing totals by facility or customer can further pinpoint where discrepancies arise. By ensuring proper data type usage and transformation steps, FYTD totals will remain consistent and reliable across Power BI and SQL.
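The remove-duplicates caveat can be illustrated with a small Python sketch (the column names and rows are hypothetical): a keep-first dedupe on a non-unique key silently keeps whichever row happens to arrive first, so the surviving total depends on source ordering.

```python
# Keep-first dedupe, roughly what a "Remove Duplicates" step does.
def dedupe_keep_first(rows, key):
    seen, kept = set(), []
    for row in rows:
        if row[key] not in seen:
            seen.add(row[key])
            kept.append(row)
    return kept

# Two distinct rows that share an invoice number (hypothetical data).
rows = [
    {"invoice": "A1", "facility": "F01", "amount": 100.00},
    {"invoice": "A1", "facility": "F02", "amount": 250.00},
]

# Deduping on "invoice" alone keeps a different amount depending on row
# order, so the surviving total can shift between refreshes.
print(dedupe_keep_first(rows, "invoice")[0]["amount"])                   # 100.0
print(dedupe_keep_first(list(reversed(rows)), "invoice")[0]["amount"])  # 250.0
```

Deduping on a stable composite business key (e.g. invoice plus facility), or removing the step entirely, avoids this nondeterminism.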

Regards,
Karpurapu D.,
Microsoft Fabric Community Support Team.

HarishKM
Memorable Member

@db042190 Hey,

Step 1: Check the import and refresh process. Ensure that the data refresh on both the service and the desktop includes all necessary tables and views. Double-check any filters applied during import.

Step 2: Validate relationships and joins. Review the relationships in the Power BI model and the join logic in SQL. Ensure they align precisely, focusing on the keys used for joins. Consider possible cardinality issues or missing join conditions.

Step 3: Investigate calculated measures. If the semantic model includes calculated measures or columns, analyze them to ensure consistent logic, and check that measures apply correctly across all environments.

 

Thanks
Harish M

Kindly accept this as a solution if it resolves your problem, and please consider giving kudos to the post as well.

Thx Harish, that's pretty much what I'll try today. For full transparency, I should also have mentioned that I made significant changes to DAX-based items in the clone, which is intended to replace the original when the dust settles, mostly because the original author used Today() a lot in a report that reports sales through yesterday.

But the mystery of how a refresh on the desktop made the downloaded original show the same numbers as the clone, on a semantic model that is static after 4 AM, remains the three-ton elephant in the room. That baffles me. I'll report back here with what I find; it's like a needle in a haystack. Also, I should have mentioned that the original author had (or maybe still has) an applied step that eliminates dupes. I'll take a look at that too. Bad practice in my opinion, but maybe I'll get some insight.

v-karpurapud
Community Support

Hi @db042190 

Thank you for reaching out to the Microsoft Fabric Community Forum and thank you @rohit1991 and @Greg_Deckler for sharing the insights.

 

The difference in FYTD totals between the Power BI service and the desktop is likely caused by a persistent filter or an automatic bookmark in the service. Persistent filters and bookmarks save your filter, slicer, and visual selections across sessions, but these do not apply in Power BI Desktop, which can lead to mismatched totals. To fix this, try clicking the "Reset to default" icon in the report header in the service to clear all filters and return the report to its original state. Also check any bookmarks, especially those set as default or used for navigation, to make sure they are not applying filters you don't intend. If you don't need persistent filters, consider turning this feature off in the report settings to keep results consistent.

Regards,
Karpurapu D,
Microsoft Fabric Community Support Team.

Thx karpurapud, I'll poke around to see. There is one filter that must always be on in this report, and I believe we removed the ability (maybe by hiding it) for users to change it; it's an intercompany flag. There is also a facility filter, where choosing anything other than All would be at the user's discretion.

I never used bookmarks, and we only have Pro licenses (which maybe precludes using them?), but I'll poke around to see what they are and whether we use them. I've read about them before, but it's been a while.

rohit1991
Super User

Hi @db042190 

Please keep the following points in mind:

  • Check relationships in Power BI, especially for duplicates or incorrect join types.

  • Compare filters and slicers used in each model/report.

  • Review calculated columns/measures for logic differences.

  • Verify data source and refresh settings, make sure all are pulling from the same data and updated.

  • Look for Row-Level Security (RLS) differences between environments.

  • Check for hidden duplicates or data being counted twice in the model.

  • Check float/decimal handling but usually this isn’t a large discrepancy.

Break down the total by facility/customer to see where numbers diverge. Usually, it's a relationship or filter context causing this kind of mismatch.


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!

Thx rohit. I'm poking around today. Luckily, RLS isn't in our environment just yet. The data sources should be identical. I'm suspecting both reports are wrong by a few dollars. I have an old SSRS report that the prod Power BI report replaced, and I have a feeling it's going to be very useful, both for this mystery and for verifying my DAX fixes in the clone for the short-sighted use of Today() in a report that reports through yesterday. When Today() is the 1st of the month, nothing shows. We've known about this shortcoming for a long time; I finally went down the rabbit hole of fixing it.
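The 1st-of-the-month failure mode is just date arithmetic. Here is a minimal Python sketch of the logic (the real fix would live in DAX or the SQL views; a month-to-date window is used for illustration, and the same anchoring applies to a fiscal-year start):

```python
from datetime import date, timedelta

# Naive TODAY()-anchored logic: window from the 1st of *today's* month
# through yesterday. On the 1st, yesterday falls in the prior month, so
# the window is empty and nothing shows.
def naive_window(today):
    return today.replace(day=1), today - timedelta(days=1)

# Fix: anchor the period on the day actually being reported (yesterday).
def mtd_window(today):
    yesterday = today - timedelta(days=1)
    return yesterday.replace(day=1), yesterday

first_of_month = date(2025, 11, 1)
start, end = naive_window(first_of_month)
print(start > end)                 # True: empty window on the 1st

print(mtd_window(first_of_month))  # a valid window covering all of October
```

Replacing the Today() anchor with a "yesterday" anchor in every period calculation keeps the report populated on the 1st.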

Greg_Deckler
Community Champion

@db042190 The original discrepancy between the service and the desktop could have been due to an automatic bookmark in the service. There is an icon on the right of the report header to reset the report filters.



