I have a Power BI semantic model with incremental refresh enabled. I use a GitHub connection for version control, and I make changes in a dev workspace connected to a new branch before merging to main (which is connected to the prod workspace). Currently, when I merge to main, the data is overwritten.

How can I ensure that the existing data in historical partitions remains intact when merging semantic model changes to production?
Hello @ua77, what do you mean by "overwritten"? Did the partitions disappear, was a full refresh triggered, or is historical data actually lost?
Hi,

I am experiencing both: the partitions have disappeared and the historical data is lost.

When I accept the incoming changes from main, all existing partitions (and their data) disappear. When I then trigger a refresh of the semantic model, it starts a full refresh and rebuilds all partitions from scratch. The problem is that this full refresh takes a long time and causes downtime for the reports until it completes.
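Not an official fix, but one workaround sometimes used to shorten the downtime window: after the Git sync rebuilds the partitions, refresh only the partitions you need (most recent first) through the enhanced refresh REST API instead of running one full model refresh. A minimal sketch follows; the workspace ID, dataset ID, and table/partition names are placeholders, and obtaining the Azure AD access token is out of scope here.

```python
import json

# Hypothetical IDs -- replace with your own workspace and semantic model IDs.
GROUP_ID = "00000000-0000-0000-0000-000000000000"
DATASET_ID = "11111111-1111-1111-1111-111111111111"

REFRESH_URL = (
    "https://api.powerbi.com/v1.0/myorg/groups/"
    f"{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
)

def build_refresh_body(table, partitions):
    """Enhanced-refresh request body that processes only the listed
    partitions, leaving all other (historical) partitions untouched."""
    return {
        "type": "full",                 # fully process the listed objects only
        "commitMode": "transactional",  # commit all-or-nothing
        "applyRefreshPolicy": False,    # don't re-apply the incremental policy
        "objects": [
            {"table": table, "partition": p} for p in partitions
        ],
    }

# Example: refresh just the newest partition of a hypothetical "Sales" table.
body = build_refresh_body("Sales", ["2026Q1"])
print(json.dumps(body, indent=2))

# To actually submit (requires an AAD token with Dataset.ReadWrite.All scope):
# import requests
# requests.post(REFRESH_URL, json=body,
#               headers={"Authorization": f"Bearer {access_token}"})
```

This lets the most recent partitions (and therefore the reports that depend on them) come back quickly, while older partitions are backfilled in later, smaller refresh calls.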