
Anonymous
Not applicable

'Group by' refreshes data

I get data from 6 different data sources and load it into 6 tables, then combine the 6 tables with Table.Combine. They all share a common column. When I try to do 'Group by' on that column, the data from the 6 sources refreshes again (on top of the first refresh that happens when the tables are combined). This makes the whole refresh very slow for the combined table.
Is it possible to prevent the data from refreshing a second time at the 'Group by' step? Please help.
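For reference, the setup described above can be sketched in M roughly like this (Source1..Source6 and the "Key" column are placeholder names, not the poster's actual queries):

```m
let
    // Six queries, one per data source (placeholder names)
    Combined = Table.Combine({Source1, Source2, Source3, Source4, Source5, Source6}),
    // Group on the shared column; this is the step where the
    // sources are reportedly read a second time
    Grouped = Table.Group(
        Combined,
        {"Key"},
        {{"RowCount", each Table.RowCount(_), Int64.Type}}
    )
in
    Grouped
```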

4 REPLIES
Anonymous
Not applicable

Hi @Anonymous 

 

Does this happen in the PQ Editor, or when you refresh the dataset after it has been loaded into the Designer?

In the PQ Editor this may be normal, as it reads and refreshes the underlying data sources whenever it thinks it needs to.

 

I tested the Table.Combine() behaviour and it looks like it reads every data source just once, assuming none of the underlying tables is exposed/returned from the Editor. If you return the underlying tables as well as the combined table, it will read each data source twice: once for the table itself and once for the combined table (unless you are doing this in a PBI Dataflow). Could you please check against the above?

 

Kind regards,

JB

Anonymous
Not applicable

Thanks @Anonymous,

 

It's not Table.Combine but Table.Group that reads the data a second time. I am talking about the refresh in the Desktop, not in the PQ Editor itself. Once the query changes are applied, I want to refresh the data daily. When I select the combined table and refresh it, all 6 tables are refreshed twice: once at the Table.Combine step and again at the Table.Group step.

Anonymous
Not applicable

Hi mate,

Sounds a bit strange. I tested grouping too, and it did not produce an additional data import from the files, which is why I didn't update my earlier reply.

Try wrapping each imported table in Table.Buffer() before using it in Table.Combine. But if the tables are large enough to cause a substantial delay when loading, buffering may make the overall execution time longer even if it reduces the number of loads.
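As a sketch of that suggestion (placeholder query names, untested against the poster's actual sources):

```m
let
    // Buffer each source table once so later steps read the
    // in-memory copy instead of going back to the data source
    B1 = Table.Buffer(Source1),
    B2 = Table.Buffer(Source2),
    B3 = Table.Buffer(Source3),
    B4 = Table.Buffer(Source4),
    B5 = Table.Buffer(Source5),
    B6 = Table.Buffer(Source6),
    Combined = Table.Combine({B1, B2, B3, B4, B5, B6})
in
    Combined
```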

For my own knowledge, how do you trace the execution? As far as I know, PBI does not have a debugger, and it's virtually impossible to tell what happens at which stage. I will do more testing, so it would also be great to know the type of your data sources. Excel files?

Thanks
JB
Anonymous
Not applicable

Thank you @Anonymous for the help.

The data is a mix of a SQL table, Excel files and API-retrieved data. Each of the 6 tables has around 30k rows. When I combine them and do a Group By, it gives 50k unique entries.

I didn't try using Table.Buffer because the first step in this 'combined' (7th) table is the Table.Combine, so the data from the individual tables is not in the context of this combined query.
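If the combined query starts directly with Table.Combine, one possible workaround (a sketch only, not verified here) is to buffer the combined result itself, so that the Table.Group step reads the in-memory copy rather than re-evaluating the combine. Query and column names are placeholders:

```m
let
    // Buffer the result of the combine; Table.Group then works on
    // the in-memory table instead of re-reading the six sources
    Combined = Table.Buffer(
        Table.Combine({Source1, Source2, Source3, Source4, Source5, Source6})
    ),
    Grouped = Table.Group(
        Combined,
        {"Key"},  // placeholder for the shared column
        {{"RowCount", each Table.RowCount(_), Int64.Type}}
    )
in
    Grouped
```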
