Hi all,
We have Power BI Premium capacity. I want to create a dataflow to push a large fact table (about 2 billion rows) with incremental refresh, along with its dimension tables, so that multiple Power BI developers can connect their reports to the dataflow.
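For context on the incremental refresh part: a dataflow entity supports incremental refresh when its query filters on the `RangeStart` and `RangeEnd` DateTime parameters that the service injects at refresh time. A minimal Power Query M sketch, assuming a hypothetical `FactSales` table with an `OrderDate` column (server, database, and column names below are placeholders):

```powerquery-m
let
    // Hypothetical Synapse dedicated pool connection
    Source = Sql.Database("myserver.sql.azuresynapse.net", "SalesDW"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filtering on RangeStart/RangeEnd lets the service partition the
    // refresh so only new/changed windows are loaded each run
    Filtered = Table.SelectRows(
        Fact,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

With a filter like this in place, the incremental refresh policy (store N years, refresh N days) is configured on the entity in the dataflow settings.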
Is there a maximum row-count cap in dataflows?
I would also like to know the best practices for dataflows and for handling large datasets in Power BI Premium.
The issue I currently have is that the fact and dimension tables sit on a Synapse dedicated SQL pool, but performance is poor when we use DirectQuery mode from Power BI Desktop, so we end up creating aggregated tables at different granularities and some materialized views.
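For reference, the materialized views we maintain on the dedicated pool look roughly like the sketch below (table, column, and grain choices are hypothetical; Synapse dedicated pools require a distribution option on materialized views):

```sql
-- Hypothetical pre-aggregation of the 2B-row fact table to store/day grain,
-- so DirectQuery reports at that grain avoid scanning the base table.
CREATE MATERIALIZED VIEW dbo.FactSales_DailyByStore
WITH (DISTRIBUTION = HASH(StoreKey))
AS
SELECT
    StoreKey,
    DateKey,
    SUM(SalesAmount) AS TotalSales,
    COUNT_BIG(*)     AS RowCnt   -- COUNT_BIG(*) keeps the view maintainable
FROM dbo.FactSales
GROUP BY StoreKey, DateKey;
```

Each new reporting grain the business asks for tends to mean another view like this, which is the housekeeping burden described above.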
To reduce this housekeeping work, and since end-user requirements change over time, I am thinking of pushing the whole table into the Power BI Premium dataflow service and turning on the enhanced compute engine, so the report developers can point to this dataflow (using DirectQuery mode). Is this a best practice?