Hello Everyone,
I am working with an Orders and OrderItems model. Six individual dimension keys, one of which is a date field, together make up the primary key. To make the filters work effectively I am using the linking-table concept: I create a primary key in the linking table via ETL and use that table to filter both fact tables.
Basically, my model looks like this:
Orders <==> OrdersDimension <==> Rest of Dimensions
OrderItems <==> OrdersDimension <==> Rest of Dimensions.
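For context, the key itself is nothing fancy: it is just the six dimension keys concatenated. I build it in the ETL, but conceptually it is equivalent to a calculated column like the one below (DimKey1–DimKey5 and DateKey are placeholder names, not my real columns):

-- Sketch only: the composite key that the bridge and both facts share.
-- In my real model this is created in ETL, not as a DAX calculated column.
Bridge Key =
    Orders[DimKey1] & "|" &
    Orders[DimKey2] & "|" &
    Orders[DimKey3] & "|" &
    Orders[DimKey4] & "|" &
    Orders[DimKey5] & "|" &
    Orders[DateKey]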
This works fine, but with historical data the ETL produces more than 2 billion unique primary-key values, which hits the roughly 2 billion distinct values per column limitation in Tabular (see https://stackoverflow.com/questions/10181316/2-billion-distinct-values-per-column-limitation).
How can I get around this limitation?
I have also tried another approach: linking the dimensions to the fact tables directly. But then I have to write complex DAX to propagate the filter conditions to the OrderItems measures.
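For example, for every OrderItems measure I end up with a filter-transfer pattern roughly like this (only three of the six keys shown; all table and column names are placeholders):

-- Rough sketch of the filter propagation I have to repeat per measure
-- when the dimensions are related to Orders but not to OrderItems.
OrderItems Amount :=
CALCULATE (
    SUM ( OrderItems[Amount] ),
    TREATAS (
        SUMMARIZE ( Orders, Orders[DimKey1], Orders[DimKey2], Orders[DateKey] ),
        OrderItems[DimKey1], OrderItems[DimKey2], OrderItems[DateKey]
    )
    -- ...and so on for the remaining dimension keys
)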
Basically, I am trying to solve the same kind of problem described here:
https://community.powerbi.com/t5/Desktop/Best-Way-for-work-with-Multiple-Fact-Tables/td-p/21441
I saw an answer in that thread, but I have no clue how to do the merge.
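If merging means collapsing the two facts into a single table at the OrderItems grain, I imagine it would be something like the calculated-table sketch below, but that assumes a plain OrderItems-to-Orders relationship on a single OrderID column, which is not how my model is set up today (all names here are placeholders):

-- Guess at what a merged fact could look like: OrderItems grain, with the
-- Orders attributes pulled across. Assumes an active OrderItems -> Orders
-- relationship on OrderID, which my current model does not have.
CombinedFact =
ADDCOLUMNS (
    OrderItems,
    "OrderDate", RELATED ( Orders[OrderDate] ),
    "CustomerKey", RELATED ( Orders[CustomerKey] )
)

Is that the right idea, or am I misreading the answer?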
Any suggestions are appreciated. Thanks in advance.
Thank you @GilbertQ. At the moment that is one of the options I am considering. In the meantime, I am also trying to find out whether there is any other workaround for this situation.