I thought there was a blog post from Chris about this in the past, but I can't find it.
I often play the role of a normal user who works with Excel pivot tables connected to a tabular model in Power BI. In this role, there are times when I need a local/offline copy of the data. For example, I may be getting on an airplane, and I want to keep a copy of the data for the flight so I can continue to edit my Excel workbooks.
Similarly, I may fly from Chicago to Europe, and the latency when connecting to my datasets may become prohibitive (e.g. if my datasets are hosted in the North Central region and I'm trying to use them from elsewhere).
Using remote Power BI datasets becomes *very* frustrating when Excel latencies are greater than ~20ms or so. At about ~10ms users will start to notice an inferior experience in their pivot tables, and past ~50ms the latencies become so great that pivot tables in Excel are almost unusable. As such, there must be a way to move our datasets down into local memory in Excel (perhaps as a Power Pivot data model). This would allow users to continue working productively even when they are traveling or working in a remote location more than 20ms away from the Azure region.
Can anyone share the best way to move a model onto the local machine? There would always be a filter involved, and a reasonable size threshold (10 or 20 GB model size, for example). As of now, the only solution I can think of is to use Power Query / Power Pivot and build an entirely new model from scratch which imports from the Power BI dataset. This seems like a very unreasonable strategy: it creates duplication, and it will be labor-intensive in the long run as the source model changes over time.
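For reference, the closest thing I've found to a filtered extract is running a hand-written DAX query against the dataset from Power Query. This is only a sketch, not a tested solution: it assumes the workspace exposes the XMLA endpoint (Premium or PPU), and the workspace, dataset, table, and column names below are placeholders, not from a real model:

```m
let
    // Placeholder workspace/dataset names; requires the XMLA endpoint
    // (Premium or PPU workspaces only).
    Source = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/My Workspace",
        "My Dataset",
        [
            // Pull only the current year's rows from a hypothetical Sales table.
            Query = "EVALUATE CALCULATETABLE ( Sales, 'Date'[Year] = YEAR ( TODAY () ) )"
        ]
    )
in
    Source
```

This still means maintaining a separate import query per table, so it doesn't solve the divergence problem, but at least the filter lives in one DAX string.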
Any tips would be appreciated. In the days of "multidimensional" models, I believe the term for this type of feature was "local cubes".
Hi @DallasBaba ,
Thanks for your response. This is sort of what I was implying when I said "use Power Query / Power Pivot and build an entirely new model from scratch".
There are problems with this. The biggest problem is that we would have to maintain this new local version of the cube in the long term, and it would diverge from the original. What we really need is just a local replica, with the exact same metadata, and some portion of the data as well (e.g. just the current year, or whatever).
Aside from the work of managing a whole new cube, there are some other problems we encounter when building an import model in Power Query, related primarily to performance. Importing from a tabular model is extremely slow, since Power Query runs MDX (instead of DAX) against the tabular dataset, and the Power BI engine does a really bad job with the MDX syntax that PQ generates (normally involving lots of crossjoins). We would much prefer the kind of performance we see when using the "download this model" command in the PBI service portal.
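To make that concrete: when you browse the dataset through the Power Query navigator, PQ builds Cube.* steps that get translated into MDX (with the crossjoins mentioned above). Supplying a native DAX query string sidesteps the generated MDX entirely. A rough sketch only, with placeholder names and assuming XMLA-endpoint access:

```m
let
    // Navigating the cube in the PQ UI produces Cube.Transform-style steps
    // that are translated into MDX. Supplying a DAX query directly avoids that.
    Source = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/My Workspace",  // placeholder
        "My Dataset",                                          // placeholder
        // Hypothetical measure/column names; "" escapes a quote in an M string.
        [Query = "EVALUATE SUMMARIZECOLUMNS ( 'Date'[Year], ""Total Sales"", [Sales Amount] )"]
    )
in
    Source
```

Even with this, the result is an import copy that must be refreshed and maintained, which is exactly the duplication I'd like to avoid.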
If I had to guess, Microsoft already knows that this "offline" functionality is greatly desired by customers, but they are probably deliberately withholding it, since it would be counterproductive to the monetization of the Power BI platform. I.e., they would rather we send more queries to the service, decrement our available CUs, and upgrade our Fabric capacity license from F64 to F6400.
@dbeavon3 you can create a local data model in Excel using Power Pivot. First import the data from Power BI using Power Query, then load it into the Power Pivot data model by selecting Load To > Add this data to the Data Model. This will enable you to create pivot tables based on this local data model, which will be available offline.
See Chris Menard's blog post and YouTube video on how to share a PivotTable without the source data or underlying data.
I hope this helps.