murugaanbu
Regular Visitor

Huge data volume consumption in Power BI

Hi Experts,

    We have a data volume of more than 10M records in Azure Synapse. What is the most performant approach to load such a large volume of data from Azure into Power BI? After the initial load we need to perform incremental data refreshes from Azure to Power BI.

 

1) Which connection mode should we use?

2) Is a star schema in Power BI performant when handling such a large volume of data?

 

Please shed some light on this.
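
For reference, a minimal Power Query (M) sketch of the RangeStart/RangeEnd filter that incremental refresh expects; the server, database, table, and date column names below are placeholders, not the actual source:

    let
        // Placeholder Synapse server and database names
        Source = Sql.Database("yourworkspace.sql.azuresynapse.net", "YourDatabase"),
        // Placeholder fact table
        FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
        // RangeStart and RangeEnd are the reserved datetime parameters that
        // Power BI incremental refresh uses to partition the data
        Filtered = Table.SelectRows(FactSales, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
    in
        Filtered

With a filter like this in place, the incremental refresh policy itself (e.g. archive N years, refresh the last N days) is configured on the table in Power BI Desktop.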

2 REPLIES
murugaanbu
Regular Visitor

We could use DirectQuery, but every interaction would then hit the backend, which we want to avoid. Our models have more than 100 columns, and the volume quoted is only an example; the actual volume exceeds 10M records.
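
One way to keep an Import-mode model small despite wide source tables is to drop unused columns in Power Query before load; a minimal sketch, with placeholder server, table, and column names:

    let
        // Placeholder Synapse server and database names
        Source = Sql.Database("yourworkspace.sql.azuresynapse.net", "YourDatabase"),
        FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
        // Keep only the columns the report actually needs (placeholder names);
        // removing unused, high-cardinality columns usually shrinks the
        // compressed dataset far more than reducing row count
        Trimmed = Table.SelectColumns(FactSales, {"OrderDate", "CustomerKey", "ProductKey", "SalesAmount"})
    in
        Trimmed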

lbendlin
Super User

Why do you need to push data from one Microsoft product to another? Can you not leave the data at the source and access it with DirectQuery?

 

10M rows is not a "huge volume"; it's on the smaller side. Huge starts when the dataset size is bigger than half of the capacity's memory.
