Good Afternoon,
I would like to be able to directly compare the time and resource utilisation of multiple methods of loading data. Specifically, I would like to compare a dataflow and a pipeline Copy data activity for loading data from Dataverse. Can anyone suggest a method that would give me this granularity of resource usage data?
Thanks in advance
Hi @JonBFabric , Thank you for reaching out to the Microsoft Community Forum.
There is currently no built-in way in Fabric to directly compare CU consumption and execution time between a Dataverse load done via a dataflow and one done via a pipeline Copy activity. Fabric does not expose ingestion-method-level benchmarking or side-by-side resource attribution; CU consumption is surfaced as telemetry only at the capacity, workspace and item-execution level.
The best approach today is to run controlled, isolated tests. Execute the dataflow refresh and the pipeline copy separately, using the same Dataverse tables and similar data volumes, on the same capacity. Use item history (dataflow refresh history and pipeline run details) to capture duration, then correlate those time windows with CU usage in the Capacity Metrics app. This allows you to reasonably attribute CU and time to each method, provided nothing else is running on the capacity during those windows.
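Once you have those numbers, the comparison itself is simple arithmetic. As a minimal illustrative sketch (plain Python, not a Fabric API: the durations would come from dataflow refresh history and pipeline run details, the CU seconds would be read manually from the Capacity Metrics app for each isolated test window, and every figure below is a placeholder):

from dataclasses import dataclass

@dataclass
class LoadTest:
    method: str        # ingestion method under test
    duration_s: float  # from dataflow refresh history / pipeline run details
    cu_seconds: float  # read from the Capacity Metrics app for that test window
    rows_loaded: int   # same Dataverse tables in both tests, so volumes match

def summarise(tests: list[LoadTest]) -> None:
    """Print a simple side-by-side comparison of the isolated test runs."""
    for t in tests:
        print(f"{t.method:<26} {t.duration_s:>7.1f} s   "
              f"{t.cu_seconds:>8.1f} CU(s)   "
              f"{t.cu_seconds / t.rows_loaded * 1_000:>6.2f} CU(s) per 1k rows")

if __name__ == "__main__":
    # Placeholder figures only; replace with your own measured values.
    summarise([
        LoadTest("Dataflow Gen2 refresh",  duration_s=400.0, cu_seconds=5000.0, rows_loaded=1_000_000),
        LoadTest("Pipeline Copy activity", duration_s=200.0, cu_seconds=2000.0, rows_loaded=1_000_000),
    ])

Normalising by row count (or by data volume) is what makes the two runs comparable, since the wall-clock duration alone does not tell you how much capacity each method actually burned.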
The upcoming Capacity Operation Events preview will improve visibility by exposing operation-level CU events, making this correlation cleaner, but it still won't automatically compare ingestion methods for you. Even with that preview, a controlled test setup will remain necessary to get a fair and defensible comparison.
Hi @JonBFabric,
If you want to get capacity utilization data at that level of detail, I recommend waiting for the upcoming Capacity Operation Events preview. This should allow you to see exactly how much CU is being consumed.
Other than that, the best approach is to use the Capacity Metrics app and wait for the data to appear, which can take between 10 and 20 minutes.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.