I have an import model published to the Power BI service. I'm connecting another local model to it via DirectQuery/live connect, and I've noticed a large change in performance. In the screenshot below you can see the exact same measure being pulled in DAX Studio for comparison. The screen on the right is the local import model; the one on the left is the same measure run through the live connection. Is this expected, or am I missing a way to optimize? I also noticed that the FE and SE flipped as far as which one was doing all of the work.
I'm connecting another local model to it via DQ/Live connect
There be dragons. Check the cardinality of the join column, and look at the query that is produced for the join. You may find that your query text (just the text, not the data) balloons to hundreds of megabytes.
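For illustration of what "just the text" means here: when a composite model joins a local table to a remote one, the query sent to the remote model can carry the join values inline as literals. A hedged sketch (table and column names are made up, and the real query Power BI generates will differ):

```
// Hypothetical shape of the DAX sent to the remote dataset; the filter
// enumerates every join-column value as a literal, so with a high-cardinality
// join column the query text itself becomes enormous.
EVALUATE
CALCULATETABLE (
    SUMMARIZECOLUMNS ( 'Dim'[Attribute], "Count", [MyMeasure] ),
    TREATAS ( { 1, 2, 3 /* ...potentially millions of values... */ }, 'Fact'[fact_ID] )
)
```

You can see the actual query text in DAX Studio's Server Timings / All Queries traces.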
@lbendlin In this measure I'm just doing a distinct count against the fact table and filtering two dimension tables. 100% of my joins between the dimension and fact tables are one-to-many. Both dimension tables are ~7 rows; the fact table is about 10M rows. I'm also using a true calendar table for any date references. It runs super fast and efficiently in the main model, but loses efficiency over the live connection for some reason. Can you explain "query text - just the text"? Are you referring to the DAX code? If so, it's just CALCULATE(DISTINCTCOUNT(fact_ID), NOT fact_type IN {"X", "XA"})
That's what you would think. In fact, the query can contain a full enumeration of all your fact values for the join column, and 10M values make for a very long query text.
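A quick way to check whether the join column is the culprit is to measure its cardinality directly. A minimal DAX Studio query, assuming the fact table is named 'Fact' and the join column is fact_ID (adjust to your model's names):

```
// Counts the distinct values of the suspected join column; a result in the
// millions means any cross-source join on this column will enumerate a huge
// literal list in the remote query.
EVALUATE
ROW ( "DistinctJoinKeys", DISTINCTCOUNT ( 'Fact'[fact_ID] ) )
```

If the number is large, keep cross-source relationships on low-cardinality columns (e.g. small dimension keys) rather than fact-grain columns.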
Start with "Use composite models in Power BI Desktop - Power BI | Microsoft Learn", and also check the other blogs on that topic.