Hi guys,
I am currently working on a project for a client. They have an Azure SQL DW, which stores data from numerous sensors in a building they own. We are talking about several hundred million rows. When I try to set up a DirectQuery connection to this DW, either from the PBI Service or the Desktop, the query time is terrible. I can easily connect to the DW, select tables, and even apply filters in the query editor. But once I start dragging measures and columns onto the canvas in my workspace, the load time is ENDLESS or the tables/charts fail outright.
My question is whether PBI is simply not fit to handle such a large amount of data, or whether indexing the tables or scaling up the DW would have any effect.
Do any of you have experience with this, and can you perhaps shed some light on the issue?
Thanks,
Casper
I've had a similar issue and have started just importing the data (which I can do since my data warehouse isn't as massive as yours, at least not yet).
Have you found a solution?
Hi,
I think performance is also related to your database tier.
I'd advise you to try switching to a premium tier.
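If you go that route, scaling an Azure SQL DW up can be done in T-SQL against the master database (a sketch; `MyDW` and the `DW400` service objective are placeholders for your own database name and target tier):

```sql
-- Scale an Azure SQL Data Warehouse to a higher service objective.
-- Run against the master database; MyDW and DW400 are placeholders.
ALTER DATABASE MyDW MODIFY (SERVICE_OBJECTIVE = 'DW400');
```

The same change can also be made from the Azure portal; either way the DW briefly pauses while it scales.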
Hi @CNR @Spicer! I think for this scenario clustered columnstore indexes are key:
https://msdn.microsoft.com/en-us/library/dn817827.aspx?f=255&MSPPError=-2147217396
https://channel9.msdn.com/Events/DataDriven/SQLServer2016/Real-Time-Operational-analytics
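For reference, converting an existing rowstore table to a clustered columnstore index looks like this (a sketch; `dbo.SensorReadings` and the index name are placeholders for your own table):

```sql
-- Rebuild an existing table as a clustered columnstore index,
-- which compresses the data and speeds up large scans/aggregations.
-- dbo.SensorReadings and cci_SensorReadings are placeholders.
CREATE CLUSTERED COLUMNSTORE INDEX cci_SensorReadings
    ON dbo.SensorReadings
    WITH (DROP_EXISTING = ON);  -- only needed if a clustered index already exists
```

Note that in Azure SQL DW, newly created tables default to clustered columnstore unless you specify otherwise.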
I have the same issue; I was forced to use Import mode instead.