@pbiuser10 I would spend my time getting to know and understand dataflows, as that is the future. That opinion doesn't mean much, though, if you aren't using the cloud and do all your work in SQL on premises; in that case, learn SSIS.
Dataflows will open doors for the organization above and beyond what SSIS does: they centrally locate important data, build on all the M queries and ETL code you write in your Power BI files, and support the latest and greatest Microsoft offerings as new features ship. SSIS is an old standby and was once really the only way you could do ETL, but things are changing. If you can, I suggest focusing on dataflows.
Sorry, I couldn't disagree more. If you have only a few dozen SSIS packages, an SSIS integration runtime will be more expensive; beyond that, dataflows become exponentially more expensive. With a few hundred SSIS packages, dataflows are 2-3 times more expensive than leaving an SSIS integration runtime on all the time. What Microsoft is not telling anyone is that once you use a dataflow to spin up the Spark cluster and reuse it for the next dataflows, the price calculation has to include both the consumption cost of each dataflow and the total time the Spark cluster was on. In essence you are paying to have the cluster on and paying again every time you use it, so you are paying twice.
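The argument above is just arithmetic, so here is a minimal sketch of the two cost models being compared. All rates and run counts are made-up placeholders (not Microsoft's actual pricing); the only point is that the dataflow side stacks a per-run consumption charge on top of the Spark cluster uptime charge, while the always-on integration runtime is a single flat uptime charge.

```python
# Hypothetical cost comparison: always-on SSIS integration runtime vs.
# dataflows billed per run PLUS for Spark cluster uptime.
# Every rate below is an illustrative assumption, not a real price.

HOURS_PER_MONTH = 730

def ssis_ir_monthly_cost(hourly_rate=1.0):
    """Flat cost of leaving an SSIS integration runtime on all month."""
    return hourly_rate * HOURS_PER_MONTH

def dataflow_monthly_cost(runs_per_month, cost_per_run=0.25,
                          cluster_hours=200, cluster_hourly_rate=1.0):
    """Per-run consumption charge stacked on the cluster uptime charge.

    This models the 'double paying' point from the post: you pay for the
    Spark cluster's on-time AND for each dataflow that reuses it.
    """
    return runs_per_month * cost_per_run + cluster_hours * cluster_hourly_rate

if __name__ == "__main__":
    runs = 300 * 30  # e.g. a few hundred packages, each run daily
    print(f"SSIS IR (always on): ${ssis_ir_monthly_cost():,.2f}")
    print(f"Dataflows:           ${dataflow_monthly_cost(runs):,.2f}")
```

With these placeholder numbers the dataflow bill comes out several times larger, which matches the 2-3x claim in spirit; plug in your organization's real run counts and rates before drawing any conclusion.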
Seth - awesome answer and quite aligned with what I suspected. We are a mixture of cloud & on-prem, so DF seems like the right choice.
BTW, grew up in Green Bay (nowhere near there anymore)...but Go Bucks!