Recently I was doing some coding work in a Notebook, dealing with files on an SFTP server: some manipulation and checks, then sending the file back to the SFTP server. The logic itself is not simple, and the large file size had to be handled, but the notebook script now works and I am satisfied with it.
My question is whether that Notebook script can be replicated with Dataflow Gen 2. To me it seems not possible, but maybe the experts here can advise better.
I know that in terms of cost, Dataflow Gen 2 is more costly than the Notebook. Maintainability also needs to be considered: does Dataflow Gen 2 help with maintainability, given that it is supposed to be low-code?
Thanks.
Solved! Go to Solution.
I would suggest you stick with notebooks for your scenario.
Use Dataflow Gen 2 if:
Sources/destinations are natively supported (OneLake, SharePoint, etc.)
The logic is simple (joins, filters, merges)
You want to expose transformations to citizen developers or analysts
If you need better orchestration or reuse, consider:
Wrapping your notebook logic in a pipeline
Using parameters and dynamic branching
Logging for better visibility
And yes, you are right that Dataflow Gen 2 is more costly: its cost depends on refresh frequency, data size, and the compute capacity it utilizes, and it is tied more closely to capacity planning and licensing.
On maintainability: a notebook will be harder for non-developers to onboard, but much more maintainable for advanced users if you modularize the logic and use clear functions with logging and comments.
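As a minimal sketch of that modular layout: the SFTP transfer itself would typically use a library such as paramiko (not shown here, and the function names below are hypothetical, not from the original notebook). The focus is on small, testable functions with logging, streaming the file line by line so a large file never has to sit fully in memory.

```python
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("sftp_job")


def clean_line(line: str):
    """Hypothetical per-line check: strip whitespace, drop blank lines."""
    line = line.strip()
    return line or None


def process_stream(src, dst, log_every: int = 1000) -> int:
    """Stream src into dst line by line, logging progress periodically.

    src and dst are any file-like objects: a local file, an in-memory
    buffer, or a remote handle opened via an SFTP client.
    Returns the number of lines kept.
    """
    kept = 0
    for i, raw in enumerate(src, start=1):
        cleaned = clean_line(raw)
        if cleaned is not None:
            dst.write(cleaned + "\n")
            kept += 1
        if i % log_every == 0:
            log.info("processed %d lines, kept %d", i, kept)
    return kept


# Usage with in-memory buffers standing in for SFTP file handles:
src = io.StringIO("a\n\n b \n")
dst = io.StringIO()
print(process_stream(src, dst))  # → 2
print(dst.getvalue())            # → "a\nb\n"
```

Structured this way, each function can be unit-tested outside the notebook, which is what makes the notebook route maintainable for advanced users.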
Please kudo and mark as answer if the reply was helpful; it will benefit other community members who face the same issue.