Hello all, I was hoping to receive your feedback regarding the existing Power BI Service architecture that we have implemented. Currently, this is how our data moves from source to Power BI reports: on-prem SQL Server => SQL job streams performing ETL => on-prem SQL Server data warehouse => Fabric Gen1 dataflows => Fabric semantic models => Power BI report.

Part of our reasoning behind this architecture was bandwidth limitations on our LAN and limitations of our data gateway server; namely, large semantic model refreshes were taxing our on-prem data gateway to the max. However, as I become more familiar with the various Fabric items, I am now wondering whether this architecture was less than ideal or whether it can be improved upon. Considering our entire ETL process is managed through several SQL stored procedures and loaded into a report-ready data warehouse, are dataflows the best Fabric item to use in this context?

Our team is relatively new to Fabric and Power BI, so as we learn, we are hoping to learn from the veterans and build a more robust and effective architecture. Thank you!
Since you mentioned bandwidth being one of your limitations, I would highly recommend getting your data into the cloud; once all the data processing happens in the cloud, your bandwidth limitation goes away. You could keep doing what you are currently doing with dataflows and that will work well. Beyond that, the choice of Fabric items mostly comes down to your team's familiarity with lakehouses, warehouses, notebooks, etc.
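For example, once the data has landed in a Lakehouse, a notebook can take over the kind of transformation work your stored procedures do today. Here is a minimal PySpark sketch, assuming a staging table named staging_sales and columns such as order_date, region, product and amount (all hypothetical names, not from your environment):

```python
# Minimal Fabric notebook sketch (PySpark). Assumes a pipeline or Dataflow Gen2
# has already landed the raw data in a Lakehouse table called "staging_sales".
from pyspark.sql import functions as F

# Read the staged table from the Lakehouse (the "spark" session is predefined
# in a Fabric notebook)
raw = spark.read.table("staging_sales")

# Example of a transformation that would otherwise live in an on-prem
# stored procedure: filter, group, aggregate
report_ready = (
    raw.filter(F.col("order_date") >= "2024-01-01")
       .groupBy("region", "product")
       .agg(F.sum("amount").alias("total_amount"))
)

# Persist as a Delta table so a semantic model can consume it without
# any traffic going back through the on-prem gateway
report_ready.write.mode("overwrite").format("delta").saveAsTable("fact_sales_summary")
```

This is only a sketch of the idea, not a recommendation to rewrite your existing procedures wholesale; the main point is that the heavy refresh traffic stays inside the Fabric capacity.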
Hi @goodbyetonto,
I wanted to check if you have had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If our response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @goodbyetonto ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
Thank you.
Thank you @GilbertQ. Would you recommend moving existing Gen1 dataflows to Gen2 considering we will likely continue to leverage them in our environment?
If you're only going to consume the dataflow output in a semantic model, I would suggest Dataflow Gen1. Otherwise, if you want to use the output in other items as well, then use Dataflow Gen2.
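To illustrate the difference, here is a minimal sketch assuming a Dataflow Gen2 has been configured with a Lakehouse destination and writes a table called dim_customer (a hypothetical name). With Gen2, that output is an ordinary Delta table that notebooks, pipelines, and the SQL analytics endpoint can all reuse, whereas Gen1 output is really only reachable from semantic models and other dataflows:

```python
# Minimal sketch, assuming a Dataflow Gen2 writes the table "dim_customer"
# to a Lakehouse destination (hypothetical name).
customers = spark.read.table("dim_customer")

# Reuse the same output for something other than a semantic model,
# e.g. a quick data-quality check inside a notebook
print(customers.count())
customers.printSchema()
```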