Hello Everyone,
I am working with an Orders and OrderItems model. Six individual dimension keys (including a date field) make up the primary key. To use the filters effectively, I am using the linking-table concept: I created a primary key in the linking table via ETL and use that table to filter the fact tables.
Basically, my model looks like this:
Orders <==> OrdersDimension <==> Rest of Dimensions
OrderItems <==> OrdersDimension <==> Rest of Dimensions.
This works fine, but with historical data I end up with more than 2 billion unique PK values via ETL, which exceeds the distinct-values-per-column limit in Tabular.
How can I get around this limitation?
I have also tried another approach of linking the dimensions to the fact tables directly, but then I have to write complex DAX to propagate the filter conditions to the OrderItems measures (a rough sketch of the kind of DAX I mean is further down).
The limitation I am running into is described here:
https://stackoverflow.com/questions/10181316/2-billion-distinct-values-per-column-limitation
Basically, I am trying to solve a problem like the one in this thread:
https://community.powerbi.com/t5/Desktop/Best-Way-for-work-with-Multiple-Fact-Tables/td-p/21441
I saw an answer provided in that thread, but I have no clue how to do the merge it suggests.
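To illustrate what I mean by complex DAX: below is a rough sketch of the kind of measure I end up writing so that slicers on the dimensions related to Orders also filter OrderItems. The table and column names (Orders[OrderID], OrderItems[OrderID], OrderItems[Amount]) are placeholders rather than my real schema, and the TREATAS pattern is just one way of expressing the filter propagation.

```
-- Sketch only: push whatever dimension filters currently reach Orders
-- down to OrderItems through a shared OrderID column, without a
-- physical linking key.
OrderItems Amount via Orders :=
CALCULATE (
    SUM ( OrderItems[Amount] ),
    TREATAS (
        VALUES ( Orders[OrderID] ),    -- OrderID values left after the slicers filter Orders
        OrderItems[OrderID]            -- treat them as a filter on OrderItems
    )
)
```

Something like this works, but it has to be repeated (or wrapped) in every OrderItems measure, which is what makes it painful compared to a plain relationship.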
Any suggestions are appreciated. Thanks in advance.
Thank you @GilbertQ. At the moment that is one of the options I am considering. Meanwhile, I am also trying to find out whether there is any other workaround for this situation.