Hello, I have the following problem.
Let's say I have a table X in my .pbix that contains 10k rows. For performance purposes I'm using incremental refresh, so on the service this table contains around 500k rows.
Now I need to make a copy of table X. I don't want to duplicate X, because there are a lot of steps in it, so I chose to use a reference instead and made a table Y where Source = table X. In Power BI Desktop everything seems fine, but after publishing to the service, incremental refresh doesn't work as I expected: table X refreshes to 500k rows, but table Y still contains only 10k rows, and I wanted them to be equal.
Am I missing something? Is a reference not the best choice here? I'd appreciate any help.
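For context, a referenced query like Y typically looks like this in Power Query M (the server, database, table, and column names below are assumptions for illustration, not from the original post). The key point is that an incremental refresh policy is configured per table: on the service, each table's M query is evaluated against the data source, so Y does not read X's refreshed partitions. Unless Y gets its own incremental refresh policy, it only loads the rows allowed by the current RangeStart/RangeEnd parameter values from Desktop.

```
// Table X — filtered on the standard RangeStart/RangeEnd
// incremental refresh parameters (assumed date column [Date]):
let
    Source   = Sql.Database("myServer", "myDb"),          // assumed source
    Data     = Source{[Schema = "dbo", Item = "FactX"]}[Data],
    Filtered = Table.SelectRows(Data, each
        [Date] >= RangeStart and [Date] < RangeEnd)
in
    Filtered

// Table Y — a reference to X; this copies X's query logic,
// not the 500k rows X holds on the service:
let
    Source = X
in
    Source
```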
Hello @DarSz ,
what is your data source? And how big is your file when you save it with all the rows included?
In general, 500k rows should not be a problem for a database, nor for Power BI, so I would not implement incremental refresh unless there is a specific reason to use it.
Incremental refresh is meant for tables with tens of millions or billions of rows.
If you need any help please let me know.
If I answered your question I would be happy if you could mark my post as a solution ✔️ and give it a thumbs up 👍
Best regards
Denis
Blog: WhatTheFact.bi
My data source is a SQL database, and my file isn't that big. The problem is with publishing to the Power BI service: when I duplicate table X and push my .pbix, the refresh ends with a timeout.
As for incremental refresh, I'm using it to reduce the load on my database.