Hello everyone,
We are currently testing the Copy job functionality in Fabric.
We always load from the same on-prem SQL Server.
The surprising thing is that each run consumes a comparatively large number of CU(s), regardless of whether it is a full load of 1 million rows (<4 minutes) or an incremental load 10 minutes later that only reloads 26 rows (<1 minute):
always 4 × 5400 CU(s) + 360 CU(s). This means we use 12.71% of the daily capacity of an F2 for each load, i.e. we could run about 6 loads throughout the day. That is far too many resources for such comparatively small requests.
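For reference, a minimal sketch (plain Python, using only the figures from this post) of the arithmetic behind the 12.71% figure:

```python
# An F2 capacity provides 2 CUs continuously, so the daily budget in
# CU-seconds is 2 CUs * 86,400 seconds = 172,800 CU(s).
f2_cus = 2
daily_budget = f2_cus * 24 * 60 * 60  # 172,800 CU(s)

# Each Copy job run was billed 4 * 5400 CU(s) plus 360 CU(s).
per_load = 4 * 5400 + 360  # 21,960 CU(s)

# Share of the daily F2 budget consumed by a single load.
print(f"{per_load / daily_budget:.2%}")  # 12.71%
```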
Now to my question:
How can this match up with the 1.5 CU per hour (= 5400 CU(s)) noted in the docs?
Pricing for data pipelines - Microsoft Fabric | Microsoft Learn
The 360 CU(s) do match, since that is roughly equivalent to the 4-minute run, but the 4 × 5400 CU(s) make the Copy job unusable, because that is far too much. Could this be a bug, and should it not actually be charged for every load?
Thank you for your help!
Stewwe
@Anonymous @Fabrico
Hello everyone,
The ticket to Microsoft had an effect: the problem on my side is solved, and the 4 × 5400 CU(s) charges are gone 🙂
@Stewwe May I ask which report you are using? Is it an adaptation of the Fabric Capacity Metrics app?
My test did exactly the same thing with exactly the same numbers. I imagine it's a bug that needs to be fixed; otherwise nobody on a capacity below F8 or F16 would touch these features.
It seems odd to me that the figures are nice round numbers and always the same; it feels like a placeholder that someone forgot to remove before pushing to Preview.
Hey Fabrico,
Thank you for your feedback. Let's wait and see when Microsoft has clarified the case. 🙂
Regards
Stewwe
Hi @Stewwe ,
If you're seeing 5400 CUs, this would imply a very high duration or multiple activities being calculated together. For example, 5400 CUs would correspond to 3600 hours of data movement (5400 CUs / 1.5 CU per hour).
If 360 CUs are being consumed, this would be equivalent to 240 hours of data movement (360 CUs / 1.5 CU per hour), which is indeed far more than 4 minutes.
Given that Copy Job is still in the preview stage, there are still a lot of things that need to be improved, so I suggest keeping an eye on the official documentation for updates.
Best Regards
Yilong Zhou
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
Hello Yilong,
Thank you for your reply.
I understand the capacities a little differently:
With an F2 you have 60 CU(s) per timepoint (30 seconds), i.e. per hour you have 60 CU(s) × 2 timepoints per minute × 60 minutes = 7200 CU(s).
Plan your capacity size - Microsoft Fabric | Microsoft Learn
A similar calculation applies to data movement: 1.5 CU × 3600 seconds = 5400 CU(s) per hour.
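The two hourly figures can be reproduced with the same arithmetic; a sketch assuming the 30-second billing timepoints described in the capacity planning docs:

```python
# F2: 2 CUs * 30 s = 60 CU(s) per timepoint; 2 timepoints per minute,
# 60 minutes per hour.
cu_per_timepoint = 2 * 30               # 60 CU(s)
f2_hourly = cu_per_timepoint * 2 * 60   # 7200 CU(s) per hour
print(f2_hourly)

# Data movement is metered at 1.5 CUs, i.e. 1.5 CUs sustained
# over 3600 seconds of run time.
dm_hourly = 1.5 * 3600                  # 5400.0 CU(s) per hour
print(dm_hourly)
```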
Back to this case:
‘Given that Copy Job is still in the preview stage, there are still a lot of things that need to be improved.’ >>> Agreed...
I also think we are still in the preview stage here and that the reported consumption will still be corrected.
So we have to wait 🙂
Regards
Stewwe
Hi @Stewwe ,
Yes, I think what you have articulated is correct.
Again, as I said earlier, hopefully there will be further updates before Copy job reaches general availability.
Best Regards
Yilong Zhou
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
Hello Yilong,
Have you passed the case on internally, or should I open a ticket?
Kind regards
Stewwe
Hi @Stewwe ,
We only provide Q&A and advice on the forum and cannot pass the case on internally.
I think you can open a ticket on your own and get your questions answered by a professional.
Best Regards
Yilong Zhou
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.