Hello. When I refresh my report in Power BI Desktop I get this error, because the data in the test area of Snowflake contains duplicates.
If I publish the un-refreshed report to the Service and refresh it there, I don't get any errors. Does a refresh in the Service not detect model errors caused by data problems?
Thanks.
Hello @Newcolator,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @Newcolator,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the resolution faster.
Thank you.
Did you know that when you create a new PBIX in Power BI Desktop and publish it to the Power BI Service, it creates two objects: the Dataset (the semantic model) and the Report (the pages of visuals)?
Possible reasons why you are getting different results:
1) In Power BI Desktop, apply the changes and preview all tables, then apply changes and refresh.
2) You have not checked the dataset refresh history in the Service, which will show the error.
3) The error you are getting is with the Dataset and not the Report. If you are only publishing the Report, you may have a different dataset in Power BI Desktop than in the Power BI Service.
4) The Power BI Service dataset has parameter settings different from your Power BI Desktop.
5) You are using a data source with row-level security. Your Power BI Desktop credentials can see the duplicate rows, but your Power BI Service gateway credentials cannot.
Hope these help.
Click the thumbs up for trying.
Click accept as solution if any of these work.
Thank you
Thanks for that. So I'm guessing there's no way to make the refresh in the service pick up on the error and fail?
Hi @Newcolator,
Thank you for reaching out to the Microsoft fabric community forum.
Unfortunately, no, there is no built-in way to make the Power BI Service fail a dataset refresh due to model-level issues like duplicate keys, unless those errors cause actual query failures or model-breaking behaviour at refresh time.
Please consider the workarounds below:
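One way to make the Service refresh fail on duplicates is to assert uniqueness inside the query itself, so the refresh errors out before the data reaches the model. A minimal Power Query (M) sketch, assuming the table is called CLAIM with key column CLAIM_KEY as in the original post; "PreviousStep" stands for whatever step currently returns the CLAIM table:

```
// Power Query (M): abort the refresh if CLAIM_KEY contains duplicates.
let
    RowCount      = Table.RowCount(PreviousStep),
    DistinctCount = Table.RowCount(Table.Distinct(PreviousStep, {"CLAIM_KEY"})),
    Result = if RowCount = DistinctCount
             then PreviousStep
             else error "CLAIM_KEY contains duplicate values - refresh aborted"
in
    Result
```

Because the `error` expression is raised during query evaluation, both Power BI Desktop and the Service refresh will fail with the same message, and it will appear in the dataset refresh history.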
Thank you, @speedramps & @ajaybabuinturi for sharing valuable insights.
If this information is helpful, please “Accept as solution” and give a "kudos" to assist other community members in resolving similar issues more efficiently.
Thank you.
Hi @Newcolator,
I hope the information provided has been useful. Please let me know if you need further clarification or would like to continue the discussion.
If your question has been answered, please “Accept as Solution” and Give “Kudos” so others with similar issues can easily find the resolution.
Thank you.
Hi @Newcolator,
Power BI Desktop validates relationships and data integrity strictly when you refresh your model. If the column on the one-side of a many-to-one relationship (or a primary key column) contains duplicates, it immediately throws an error, because the duplicates break the relationship's cardinality rules.
In your case, CLAIM_KEY is expected to be unique, but your Snowflake test data contains duplicates such as '43924319'.
Why Power BI Service doesn’t throw the same error
Power BI Service does not always enforce the same validation logic at refresh time as Power BI Desktop. This discrepancy can be due to:
Important: Even though it doesn’t fail in the Service, the model can be logically incorrect — visuals depending on CLAIM_KEY might give wrong results.
You should resolve the duplicates in one of these ways:
1. Filter duplicates out
In Power BI Desktop, go to Power Query (Transform Data):
= Table.Distinct(CLAIM, {"CLAIM_KEY"})
Or filter based on a condition that avoids test data.
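Either variant can be expressed as a single step in the Advanced Editor. A sketch assuming the prior step is named "PreviousStep" and that a hypothetical IS_TEST flag column marks test rows (adjust names to your query):

```
// Option A: keep only the first row per CLAIM_KEY.
Deduped = Table.Distinct(PreviousStep, {"CLAIM_KEY"})

// Option B: filter out known test rows instead of de-duplicating.
NoTestRows = Table.SelectRows(PreviousStep, each [IS_TEST] <> true)
```

Option A silently discards rows, so prefer Option B when you can identify the test data by a condition rather than by key collisions.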
2. Change the relationship
If duplicates in CLAIM_KEY are valid in your scenario (e.g., one claim has multiple entries), you may need to:
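One common approach in that situation, sketched in Power Query M (the names ClaimSource and ClaimDim are placeholders, not from the original post), is to build a separate dimension query containing only distinct keys, and relate the fact table to that dimension instead:

```
// New query "ClaimDim": distinct keys for the one-side of the relationship.
// ClaimSource is whatever step returns the full CLAIM table.
ClaimDim = Table.Distinct(Table.SelectColumns(ClaimSource, {"CLAIM_KEY"}))
```

The fact table keeps its duplicates, while ClaimDim satisfies the uniqueness requirement on the one-side of the many-to-one relationship.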
3. Handle test vs prod data differently
If this only happens in a test Snowflake environment, consider:
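One common pattern (my sketch, not stated in the reply above) is a query parameter that switches the Snowflake database between environments; the parameter value can then be overridden in the dataset settings in the Service. The names Environment, TEST_DB, PROD_DB, and the server/warehouse strings are placeholders:

```
// Parameter "Environment" (Text, allowed values "Test" / "Prod"),
// created via Manage Parameters in Power Query.
DatabaseName = if Environment = "Prod" then "PROD_DB" else "TEST_DB",
Source   = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WAREHOUSE"),
Database = Source{[Name = DatabaseName]}[Data]
```

With this in place, Desktop can point at test data while the published dataset refreshes against production, where the duplicates do not exist.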
Thanks,
If you found this solution helpful, please consider giving it a Like👍 and marking it as Accepted Solution✔. This helps improve visibility for others who may be encountering the same issue.