@pbiuser10 I would spend my time getting to know and understand dataflows, as that is the future. But that opinion doesn't mean much if you aren't using the cloud and do all your work in SQL on premises, etc.; in that case, learn SSIS.
Dataflows will open doors for the organization above and beyond what SSIS does in terms of centrally locating important data, based on all the M queries and ETL code you write in your Power BI files, and they support the latest and greatest Microsoft offerings with new features. SSIS is an old standby and for a long time was really the only way you could do ETL, but things are changing, so if you can, I suggest focusing on dataflows.
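For context, each entity in a dataflow is defined by a Power Query M query much like the ones already authored inside a .pbix, which is what makes centralizing that logic straightforward. A minimal sketch of the kind of query that would be moved into a dataflow (the server, database, and table names here are hypothetical placeholders):

let
    // Connect to a SQL Server source (server/database names are placeholders)
    Source = Sql.Database("sql-prod-01", "SalesDW"),
    // Navigate to the table we want to centralize in the dataflow
    Orders = Source{[Schema = "dbo", Item = "FactOrders"]}[Data],
    // Keep only the columns downstream reports need
    Trimmed = Table.SelectColumns(Orders, {"OrderID", "OrderDate", "Amount"}),
    // Filter to the current reporting window
    Filtered = Table.SelectRows(Trimmed, each [OrderDate] >= #date(2024, 1, 1))
in
    Filtered

Once a query like this lives in a dataflow, multiple reports can reuse the result instead of each .pbix repeating the same ETL.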
Sorry, I couldn't disagree more. If you only have a few dozen SSIS packages, the SSIS integration runtime will be more expensive; beyond that point, dataflows become exponentially more expensive. With a few hundred SSIS packages, dataflows are 2-3 times more expensive than leaving an SSIS integration runtime on all the time. What Microsoft is not telling anyone is that once you use a dataflow to spin up the Spark cluster and reuse it for the next dataflows, you have to factor into the price calculation both the consumption cost of each dataflow and the total time the Spark cluster was on. In essence you are paying to have the cluster on and paying again every time you use it, thus paying twice.
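To make that double-counting concern concrete, here is a back-of-envelope sketch in M. The rates and run counts are purely hypothetical placeholders, not actual Fabric or Azure pricing; the point is only the shape of the calculation being described, a charge for keeping the cluster warm plus a separate charge for each dataflow refresh that reuses it:

let
    // Hypothetical rates for illustration only -- not real pricing
    ClusterHourlyRate  = 1.50,   // cost per hour the Spark cluster stays warm
    HoursClusterOn     = 8,      // hours the cluster is kept on across the day's refreshes
    CostPerRefresh     = 0.40,   // consumption charge billed per dataflow refresh
    RefreshesPerDay    = 30,
    // You pay for the warm cluster AND for each refresh that reuses it
    DailyCost = ClusterHourlyRate * HoursClusterOn + CostPerRefresh * RefreshesPerDay
in
    DailyCost  // 12.00 + 12.00 = 24.00 per day under these assumed numbers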
Seth - awesome answer and quite aligned with what I suspected. We are a mixture of cloud & on-prem, so DF seems like the right choice.
BTW, grew up in Green Bay (nowhere near there anymore)...but Go Bucks!