Hello,
I've connected to a table and everything was fine until I used Remove Duplicates. At that point, I get the error "DataSource.Error: Microsoft SQL: Return records size cannot exceed 83886080. Make sure to filter result set to tailor it to your report." How can this be? Surely at this point the record count should decrease, not increase! If I take out the Remove Duplicates step, all is well. The code is below:
let
    Source = CommonDataService.Database("ourdomain.crm999.dynamics.com"),
    dbo_blc_tagactivity = Source{[Schema="dbo",Item="blc_tagactivity"]}[Data],
    #"Removed Other Columns" = Table.SelectColumns(dbo_blc_tagactivity,{"actualend", "activityid", "blc_tagidname", "blc_tagtypeidname", "contact"}),
    #"Expanded contact" = Table.ExpandRecordColumn(#"Removed Other Columns", "contact", {"contactid", "statuscodename"}, {"contact.contactid", "contact.statuscodename"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded contact",{"actualend", "contact.statuscodename"}),
    #"Removed Duplicates" = Table.Distinct(#"Removed Columns", {"blc_tagidname", "blc_tagtypeidname", "contact.contactid"}),
    #"Grouped Rows" = Table.Group(#"Removed Duplicates", {"blc_tagidname", "blc_tagtypeidname"}, {{"Total", each Table.RowCount(_), Int64.Type}}),
    #"Renamed Columns" = Table.RenameColumns(#"Grouped Rows",{{"blc_tagtypeidname", "Type"}, {"blc_tagidname", "Tag"}})
in
    #"Renamed Columns"
Thanks in advance for any help 🙂
Hi @s--turn
It has probably hit the following limit of the Dataverse connector. You may try the other two methods mentioned below to connect to the same data source. Reference: Dataverse: Accessing large datasets
===================================================
Power BI datasets contained in Dataverse can be very large. If you're using the Power Query Dataverse connector, any specific query that accesses the dataset must return less than 80 MB of data. So you might need to query the data multiple times to access all of the data in the dataset. Using multiple queries can take a considerable amount of time to return all the data.
If you're using the Common Data Service (Legacy) connector, you can use a single query to access all of the data in the dataset. This connector works differently and returns the result in “pages” of 5 K records. Although the Common Data Service (Legacy) connector is more efficient in returning large amounts of data, it can still take a significant amount of time to return the result.
Instead of using these connectors to access large datasets, we recommend that you use Azure Synapse Link to access large datasets. Using Azure Synapse Link is even more efficient than either the Power Query Dataverse or Common Data Service (Legacy) connectors, and it is specifically designed around data integration scenarios.
===================================================
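For example, the error's own advice to "filter result set" can be applied by restricting rows early in the query so each request stays under the 80 MB cap. A minimal sketch (the cut-off date is only an illustration, and it assumes actualend is a datetime column you can slice on):

let
    Source = CommonDataService.Database("ourdomain.crm999.dynamics.com"),
    dbo_blc_tagactivity = Source{[Schema="dbo",Item="blc_tagactivity"]}[Data],
    // Keep the slice small enough that the source returns under 80 MB;
    // run the query per date range and append the slices if the full history is needed
    #"Filtered Rows" = Table.SelectRows(dbo_blc_tagactivity, each [actualend] >= #datetime(2022, 1, 1, 0, 0, 0))
in
    #"Filtered Rows"

Slices for different date ranges can then be appended into one table.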
Hope this helps.
Best Regards,
Community Support Team _ Jing
If this post helps, please Accept it as Solution to help other members find it.
Further to my reply, I've just checked and there are "only" 1,184,694 rows in the original dataset, before removing duplicates. Therefore the error message, which seems to imply that I've exceeded 83,886,080 rows, can't be true? I don't think I have more than 83 million rows of anything in the entire database!
Hi, thanks for your reply. I still don't understand why this is only a problem when I remove duplicates. Does the MB count go up when the Remove Duplicates step is attempted?
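Two points that may explain what's happening here (informed guesswork, not confirmed behaviour of the connector):

First, the figure in the error is a byte count, not a row count: 83,886,080 = 80 × 1,024 × 1,024 bytes, i.e. the 80 MB cap on the data a single query may return that the documentation quoted above describes. Across several columns, 1,184,694 rows can easily add up to more than 80 MB.

Second, on why only Remove Duplicates triggers it: steps such as column selection and grouping can often be folded back to the source, so the server does the work and only the small grouped result crosses the wire. If Table.Distinct is a step this connector cannot fold, everything before it has to be downloaded in full so that Power Query can remove the duplicates locally, and that full download is what hits the 80 MB limit. Pushing the distinct into the grouping step (as in the sketch above) or filtering rows early are both ways to test that theory.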