March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Title says it all. I have two 5 GB tables I am trying to import from a dataflow. I want to manipulate the tables in Power Query, but it takes the better part of an hour to load and apply every time I change a step. Is it possible to limit the columns before import (not Remove Columns after import), or to filter to a smaller subset of rows (filtering on a date field to Sept 22 data only, for example)?
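For reference, one common pattern (a sketch only; the workspace, dataflow, table, and column names below are placeholders, and the navigation steps are schematic) is to put the row and column filters immediately after the source step, so every later transform works on the reduced table. Note that the dataflow connector may still read the full table over the wire; the main win is that all subsequent steps operate on far less data.

```m
let
    // Connect to the dataflow (IDs and entity name are placeholders)
    Source = PowerPlatform.Dataflows(null),
    Workspace = Source{[Id = "WorkspaceId"]}[Data],
    Dataflow = Workspace{[Id = "DataflowId"]}[Data],
    BigTable = Dataflow{[entity = "Sales"]}[Data],

    // Filter rows first: keep only one day of data while developing
    // (date and column name are illustrative assumptions)
    OneDayOnly = Table.SelectRows(BigTable, each [Date] = #date(2022, 9, 22)),

    // Then keep only the columns actually needed downstream
    Trimmed = Table.SelectColumns(OneDayOnly, {"Date", "CustomerId", "Amount"})
in
    Trimmed
```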
That would probably work, but I have not explored the datamart feature much.
What I ended up doing was exporting the dataflow JSON, opening it in Notepad++, and adding a PreviousNMonths(2) filter to the tables I wanted to trim, then setting it up as a new QA dataflow that I only need to refresh once a week (or once a month, if MS gave us the option). I tried doing it in Power Query Online in the PBI Service, but it was taking forever, so I just edited the JSON manually. It took 15 minutes to filter 30 or so really large tables down to a size I can build datasets from. After the dataset is built, I can reconnect it in Power BI Desktop to the full dataflow.
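For anyone doing this in the query editor instead of the JSON, the equivalent M filter step would look roughly like this (a sketch; the step and column names are assumptions):

```m
// Keep only rows whose [Date] falls within the previous two calendar months
FilteredRows = Table.SelectRows(Source, each Date.IsInPreviousNMonths([Date], 2))
```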
PBI is such garbage. My old employer used Tableau and I miss it so much. Tableau wouldn't even blink at a 1m row table, much less need an hour to join ("merge" WTF) it to another.
Thank you for the idea, though! I appreciate the time and brainpower you gave me.
@DonRitchie I don't think it's possible at the current time, but instead of using dataflows, what about creating a datamart? You can build an entire model, and if you still need to do additional transformations on top of it, you can connect to it via Azure SQL to return only the relevant fields and records.
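If you go the datamart route, connecting to its SQL endpoint with a native query might look roughly like this (a hedged sketch; the server address format, database name, schema, and query are all placeholder assumptions):

```m
let
    // Datamart SQL endpoint; server and database names are placeholders
    Source = Sql.Database(
        "your-datamart.datamart.pbidedicated.windows.net",
        "YourDatamart",
        [Query = "SELECT Date, CustomerId, Amount
                  FROM model.Sales
                  WHERE Date >= DATEADD(month, -2, GETDATE())"]
    )
in
    Source
```

Pushing the filter into the SQL query means only the relevant rows and columns ever leave the server, which is exactly what Remove Columns after import cannot do.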