Hi, I'm running into an issue and hoping someone can point me in the right direction. I'm working with two tables: a customer table with 38 million rows and a sales table with 18 million rows. I've connected the tables in my data model, but every time I try to create a visual or a DAX measure, I run into the 1-million-row limit. All my research has found people saying "just add a filter to reduce the rows to below 1 million," but I want to view data for more than 1 million rows; I want to view all the data. For example, a simple line chart, by month, that shows the number of customers who purchased in that month. I don't want to show a subset of customers, I want to show every customer who has purchased in each month, which would well exceed 1,000,000.
How do others handle large data sets where you connect tables via a customer key, so as to avoid this error? Should I be creating aggregate tables? I'm currently connected to my data through Snowflake, using DirectQuery.
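For context, the kind of measure I'm trying to build is along these lines (table and column names are illustrative; my real model uses a customer key relating the two tables):

```dax
-- Hypothetical names: Sales is the 18M-row fact table,
-- Sales[CustomerKey] relates it to the customer table.
-- A distinct count per month is what the line chart should plot.
Purchasing Customers =
DISTINCTCOUNT ( Sales[CustomerKey] )
```

My understanding is that a measure like this should aggregate on the Snowflake side and only return one summarized value per month, so I'm unsure why I'm still hitting the row limit.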
Thanks for any suggestions.