Anonymous
Not applicable

'Group by' refreshes data

I get data from 6 different data sources and put them in 6 tables. I then combine (Table.Combine) the 6 tables. They all have a specific column in common. When I try to do a 'Group by' on that column, the data from the 6 sources is refreshed again (in addition to the first refresh at the time of combining the data). This makes the whole refresh very slow for the combined table.
Is it possible to prevent the data from being refreshed a second time (at the Group by)? Please help.

4 REPLIES
Anonymous
Not applicable

Hi @Anonymous

 

Does this relate to what you experience in the PQ Editor, or is this when you refresh the dataset after it has been exported to the Designer?

In the PQ Editor this may be the norm, as it reads and refreshes the underlying data sources whenever it thinks it needs to.

 

I tested the Table.Combine() behaviour and it looks like it reads every data source just once, assuming none of the underlying tables is exposed/returned from the Editor. If you return the underlying tables as well as the combined table, it will read each data source twice: once for the underlying table and again for the combined table (unless you are doing this in a PBI Dataflow). Could you please check whether that matches what you are seeing?
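For reference, the combined query in my test was just a single Table.Combine over the six staging queries, roughly like this minimal sketch (Source1 to Source6 are placeholder names, not your actual queries):

let
    // Append the six staging queries into one table; each source is read once
    Combined = Table.Combine({Source1, Source2, Source3, Source4, Source5, Source6})
in
    Combined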

 

Kind regards,

JB

Anonymous
Not applicable

Thanks @Anonymous,

 

It's not Table.Combine but Table.Group that reads the data the second time. I am talking about the refresh in the Desktop, not in the PQ Editor itself. So once the query changes have been applied, I want to refresh the data daily. When I select the combined table and refresh it, it refreshes all 6 tables twice: once at the Table.Combine step and again at the Table.Group step.
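The combined (7th) query is essentially a Table.Combine followed by a Table.Group, roughly like the sketch below (the query, column and step names are illustrative, not my exact ones):

let
    // Step 1: append the six source tables
    Combined = Table.Combine({Table1, Table2, Table3, Table4, Table5, Table6}),
    // Step 2: group on the shared column - this is where the sources seem to be read a second time
    Grouped = Table.Group(Combined, {"CommonColumn"}, {{"RowCount", each Table.RowCount(_), Int64.Type}})
in
    Grouped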

Anonymous
Not applicable

Hi mate,

Sounds a bit strange. I tested grouping too, and it did not produce an additional data import from the files, which is why I decided not to update my reply.

Try Table.Buffer() on each imported table before using it in Table.Combine. But if the tables are large enough to cause a substantial delay when loading, buffering may make the overall execution time longer even if it reduces the number of loads.
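For example, the final step of each of the six source queries could be wrapped in Table.Buffer; a minimal sketch assuming an Excel source (the file path, sheet and step names are hypothetical):

let
    // Hypothetical example of one of the six source queries; the path is a placeholder
    Source = Excel.Workbook(File.Contents("C:\Data\Source1.xlsx"), null, true),
    SheetData = Source{[Item = "Sheet1", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(SheetData, [PromoteAllScalars = true]),
    // Buffer the result so that later references do not re-read the file
    Buffered = Table.Buffer(Promoted)
in
    Buffered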

For my own knowledge, how do you trace the execution? As far as I know, PBI does not have a debugger, and it's virtually impossible to tell what happens at which stage. I will do more testing, so it would also be great to know what type of data source you have. Excel files?

Thanks
JB
Anonymous
Not applicable

Thank you @Anonymous for the help.

The data is a mix of a SQL table, Excel files and API-retrieved data. Each of the 6 tables has around 30k rows. When I combine them and do a Group by, I get around 50k unique entries.

I didn't try using Table.Buffer because the first step in this 'combined' (7th) table is the Table.Combine, so the data from the individual tables is not within the context of this combined query.
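For reference, if buffering were to be tried without touching the individual queries, it could in principle be applied inline inside the Table.Combine call itself; a minimal sketch with placeholder names:

let
    // Buffer each referenced query inline, inside the Combine call
    Combined = Table.Combine({
        Table.Buffer(Table1), Table.Buffer(Table2), Table.Buffer(Table3),
        Table.Buffer(Table4), Table.Buffer(Table5), Table.Buffer(Table6)
    }),
    Grouped = Table.Group(Combined, {"CommonColumn"}, {{"RowCount", each Table.RowCount(_), Int64.Type}})
in
    Grouped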
