Hello,
I built a data source from a folder so that the consolidated database updates as I add files containing new information. However, when I refresh, it reloads all the files again. I'd like to know whether, when refreshing data that lives in separate files, it's possible to load only the new ones.
Thanks!
Hi @Anonymous ,
There's no way you can only update queries with new data within a Power BI report. Power Query works as follows:
Wipe all data > Import new data from all sources > Perform transformations > Send to data model.
There's no 'new data' detection. As such, the only way you could quickly and easily partition your data would be to set up all the queries that shouldn't refresh when new files are added (e.g. dimension tables, tables from sources that aren't subject to manual file updates, etc.) in Dataflows, and give them their own refresh schedule.
This would mean that any sources that ARE reliant on manual file updates would need to be kept in the report, and would ALL refresh when a new file was added, but you could at least shield a number of other sources from this behaviour, potentially reducing refresh time and processing load.
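For context, this is roughly what a standard folder-source query looks like in Power Query M (a minimal sketch — the folder path and file type are placeholder assumptions, not from your report). Note that nothing in it distinguishes old files from new ones; every refresh re-runs the whole query from the top:

```powerquery-m
let
    // Every refresh re-enumerates ALL files in the folder
    Source = Folder.Files("C:\Data\Incoming"),  // placeholder path
    // Keep only CSVs (assumed file type for this sketch)
    CsvOnly = Table.SelectRows(Source, each [Extension] = ".csv"),
    // Each file's binary content is re-read and re-parsed on every refresh,
    // whether or not the file has changed since the last refresh
    Parsed = Table.AddColumn(CsvOnly, "Data", each Csv.Document([Content])),
    // All parsed tables are stacked into one consolidated table
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```

Because the wipe-and-reimport happens at the query level, filtering or splitting queries (as with the Dataflows approach below) changes *which* queries re-run, not whether an individual query re-reads its files.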
Pete
Proud to be a Datanaut!