I have 16 dataflows that are exactly the same - they contain the same transformation steps for bringing in Excel files from SharePoint, the only difference being that each dataflow filters to a different folder. This was necessary when the dataflows were initially set up because of the sheer volume of data being pulled in.
I have found a more effective way to arrange these dataflows where I can now do the same with 5 dataflows instead of 16.
However, a problem remains: if I need to make a change to one dataflow, I have to make sure the same change is made in the other 4.
Now I am trying to find out whether I can put these transformation steps into a function that can be shared across the dataflows. If functions are not possible, is there a different solution out there that would work?
Yes, custom functions will do that. Here is a video that may help you get started, plus a second video on incremental refresh of files to speed up refreshes in case you are combining date-based files.
https://www.youtube.com/watch?v=6cGou1-1FOo
https://www.youtube.com/watch?v=IVMdg16yBKE
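As a rough sketch, the shared steps can live in one custom function that takes the folder path as a parameter. Everything below (the site URL, column names, and step names) is illustrative, not taken from your actual dataflows:

```
// Hypothetical shared function in Power Query M: same transformation
// steps for every dataflow, parameterized by SharePoint folder path.
let
    TransformSharePointFolder = (FolderPath as text) as table =>
        let
            // Pull the full file listing once; site URL is a placeholder
            Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Data", [ApiVersion = 15]),
            // The only per-dataflow difference: which folder to keep
            FilteredToFolder = Table.SelectRows(Source, each Text.StartsWith([Folder Path], FolderPath)),
            ExcelOnly = Table.SelectRows(FilteredToFolder, each Text.EndsWith([Name], ".xlsx")),
            // ...your shared transformation steps would go here...
            Combined = Table.Combine(
                List.Transform(ExcelOnly[Content], each Excel.Workbook(_, true){0}[Data]))
        in
            Combined
in
    TransformSharePointFolder
```

Each dataflow's query then reduces to a single call such as `TransformSharePointFolder("https://contoso.sharepoint.com/sites/Data/Shared Documents/FolderA/")`, so a change to the shared steps only has to be made in one place per dataflow that contains the function.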
Pat