Hi Experts,
We have a data volume of >10M records in Azure Synapse. What is the most performant approach to push such a large volume of data from Azure to Power BI? After the initial load we need to perform incremental data refreshes from Azure to Power BI.
1) Which connection mode should we use?
2) Is a star schema in Power BI performant when handling such a large volume of data?
Please shed some light on this.
We could use DirectQuery, but every query would hit the backend, so we want to avoid that. Our models have more than 100 columns, and the data volume given is just an example; the actual volume is well over 10M records.
Why do you need to push data from one Microsoft product to another? Can you not leave the data at the source and access it with DirectQuery?
10M rows is not a "huge volume"; it's on the small side. Huge starts when the dataset size exceeds half of your capacity's memory.
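If you go with import mode, the usual way to handle the initial load plus subsequent deltas is incremental refresh: define the RangeStart/RangeEnd DateTime parameters in Power Query, filter the fact table on a date column, and then configure the incremental refresh policy on that table. A minimal M sketch, assuming a hypothetical FactSales table with an OrderDate datetime column in a Synapse dedicated SQL pool (the server, database, table, and column names are placeholders):

```
let
    // Connect to the Synapse dedicated SQL pool (placeholder server/database names)
    Source = Sql.Database("yourworkspace.sql.azuresynapse.net", "YourDatabase"),

    // Hypothetical fact table
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],

    // Filter on the RangeStart/RangeEnd DateTime parameters so the incremental
    // refresh policy can generate partitions; a simple comparison like this
    // should fold back to Synapse as a WHERE clause, so only the requested
    // date range is transferred on each refresh
    Filtered = Table.SelectRows(
        Fact,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

With that filter in place, you set the policy on the table in Power BI Desktop (for example, archive 5 years and refresh the last 7 days; the numbers are just illustrative), and the service will only query Synapse for the recent partitions on each scheduled refresh.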