Regular Visitor

Huge data volume consumption in Power BI

Hi Experts,

    We have a data volume of >10M records in Azure Synapse. What is the most performant approach to push such a large volume of data from Azure to Power BI? After the initial load, we need to perform incremental data refreshes from Azure to Power BI.


1) Which connection mode should we use?

2) Is a star schema in Power BI performant when handling such a large volume of data?


Please shed some light on this.
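
For the incremental-refresh part of the question, Power BI's built-in mechanism relies on two reserved DateTime parameters, `RangeStart` and `RangeEnd`, which the service substitutes at refresh time so that only new partitions are loaded. A minimal Power Query M sketch (the Synapse server, database, table, and the `OrderDate` column are placeholder names, not from the original post) looks like this:

```
// Reserved parameters; names must be exactly RangeStart / RangeEnd
// and the type must be DateTime for incremental refresh to work.
// RangeStart = #datetime(2023, 1, 1, 0, 0, 0) meta [IsParameterQuery = true, Type = "DateTime"]
// RangeEnd   = #datetime(2023, 2, 1, 0, 0, 0) meta [IsParameterQuery = true, Type = "DateTime"]

let
    // Hypothetical Synapse SQL endpoint and table
    Source = Sql.Database("myworkspace.sql.azuresynapse.net", "SalesDW"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filter on a date column so the filter folds back to Synapse
    // and only the refreshed partition's rows are transferred.
    Filtered = Table.SelectRows(
        Fact,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

After defining this query, an incremental refresh policy (e.g. "store 5 years, refresh the last 7 days") is configured on the table in Power BI Desktop; the filter must fold to the source for the refresh to stay efficient.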

Regular Visitor

We can use DirectQuery, but then every interaction hits the backend, which we want to avoid. Our models have more than 100 columns, and the data volume given is just an example; the actual volume is well over 10M rows.

Super User

Why do you need to push data from one Microsoft product to another? Can you not leave the data at the source and access it with DirectQuery?


10M rows is not a "huge volume"; it's on the small side. "Huge" starts when the dataset size exceeds half the capacity's memory.
