I'm working on a shared semantic model solution via a dataflow. Initial setup of the dataflow and basic model went well, but now that I need to edit the model (adding extra data, measures, etc.), I'm having trouble downloading and publishing it.
Since we only hold Pro licences, I don't have the option of DirectQuery against the dataflow, so both the dataflow and the model import the data. The source connection in the dataflow is a cloud connection to a PostgreSQL database, and it takes 40 minutes to refresh overnight. The published model then takes 20 minutes to refresh, and its size is 680 MB.
When I tried to download the model to edit it, it was taking a long time, so I decided to keep a copy of it on SharePoint, edit that, and republish as needed. But publishing it is also very slow.
Is there a way around this? For example, a relatively simple way to build and edit the model against a limited data sample, then expand to the full dataset when publishing?
I thought I'd ask because my intuition is usually way off with Power BI, and there's often a really simple solution I'm just clueless about.
Thanks for helping!
Hi @klev28
What I would do, instead of using the dataflow, is create all the data in Power BI Desktop using Power Query. That way it's a single refresh, and any changes can easily be made in the Power BI Desktop file.
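Regarding working on a limited sample: one common pattern is a Power Query parameter that caps the row count while you develop in Desktop, then loads everything once you're done. Below is a minimal sketch of one query, assuming a hypothetical parameter named `RowLimit` (type number, allowing null) and illustrative server, database, schema, and table names — swap in your own.

```m
// Sketch: cap rows during development, load all rows when RowLimit is null.
// RowLimit is a query parameter you create via Manage Parameters in Power Query;
// "your-server", "your-db", and the "public"/"orders" table are placeholders.
let
    Source  = PostgreSQL.Database("your-server", "your-db"),
    Orders  = Source{[Schema = "public", Item = "orders"]}[Data],
    Limited = if RowLimit = null then Orders else Table.FirstN(Orders, RowLimit)
in
    Limited
```

While editing, set `RowLimit` to something small (say 10000) so refreshes in Desktop are fast; before publishing, set it to null so the full dataset loads. Note that `Table.FirstN` limits rows after the source query unless the step folds back to PostgreSQL, so check query folding if the sample refresh is still slow.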