Greetings, I have a question.
I have a couple of tables with a lot of data, one with 17 million records and the other with 19 million. In several of my dashboards I use barely 4 million of those records.
In Power Query I use filters to exclude the service codes that don't interest me, and once everything is working, I have only the selected records.
My doubt is that when I refresh the data, the refresh dialog still runs through the 17 and 19 million rows respectively. Is there no way to refresh just those 4 million filtered records, or does it necessarily have to refresh everything?
Note:
Data comes from SQL Server and a PostgreSQL database, in Import mode.
There are records going back to 2015 and I'm only interested in 2020 onward.
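For context, the filtering described above would look something like this in Power Query, as a sketch with hypothetical server, table, and column names (Services, ServiceDate, ServiceCode):

```
let
    Source = Sql.Database("MyServer", "MyDatabase"),
    Services = Source{[Schema = "dbo", Item = "Services"]}[Data],
    // Simple filters like these normally fold back to the source as a
    // WHERE clause, so only the matching rows are transferred on refresh,
    // provided no earlier step breaks query folding.
    From2020 = Table.SelectRows(Services, each [ServiceDate] >= #date(2020, 1, 1)),
    Filtered = Table.SelectRows(From2020, each List.Contains({"A", "B"}, [ServiceCode]))
in
    Filtered
```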
Hello, @Syndicate_Admin
I suggest creating a new query with modified SQL that takes only the data from 2020 onward. This should greatly improve your refresh performance.
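For instance, a sketch using a native query (server, database, table, and column names are hypothetical):

```
let
    // Passing a native query to Sql.Database makes the server do the
    // filtering, so only rows from 2020 onward are imported on refresh.
    Source = Sql.Database(
        "MyServer",
        "MyDatabase",
        [Query = "SELECT * FROM dbo.Services WHERE ServiceDate >= '2020-01-01'"]
    )
in
    Source
```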
Greetings, thanks for the reply, but I don't have free access to the database, so I wanted to know if it was possible to reduce the queries by filtering from Power Query.
Another example: that table with 17 million records has a column with a list of services, and a customer can receive several services. I need to count how many customers received service A and how many received service B.
But since both codes are in the same column, I duplicated the table, filtered it to service B only, and made a merge query against the original table, which means that when it refreshes it runs through 34 million records... and I have to do that same query several times...
Now I don't know if there is a more practical way to do that.
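As described, the current approach looks roughly like this (hypothetical names), which is why each refresh passes over the source twice:

```
let
    Services = Sql.Database("MyServer", "MyDatabase"){[Schema = "dbo", Item = "Services"]}[Data],
    // Duplicated query filtered to one service code...
    ServiceB = Table.SelectRows(Services, each [ServiceCode] = "B"),
    // ...then merged back onto the original, so both copies of the query
    // hit the source on every refresh.
    Merged = Table.NestedJoin(
        Services, {"CustomerId"},
        ServiceB, {"CustomerId"},
        "ServiceB", JoinKind.LeftOuter
    )
in
    Merged
```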
Hi, @Syndicate_Admin
If you need just a count, why not group by service code and count the customers for A/B using the Group By function:
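A minimal sketch with Table.Group (the Group By button in the Power Query editor), again with hypothetical column names:

```
let
    Services = Sql.Database("MyServer", "MyDatabase"){[Schema = "dbo", Item = "Services"]}[Data],
    // One output row per service code with the number of distinct
    // customers, in a single query instead of duplicating and merging.
    Counts = Table.Group(
        Services,
        {"ServiceCode"},
        {{"Customers", each Table.RowCount(Table.Distinct(_, {"CustomerId"})), Int64.Type}}
    )
in
    Counts
```

Whether this folds to a GROUP BY on the server depends on the exact steps, but either way it avoids the duplicate-and-merge pattern, so the refresh passes over the table only once.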
Ahhh, I will look for a manual to understand that function well...
This is what happened with my merge query: