I’m trying to migrate a table containing approximately 100,000 records into a warehouse using a DACPAC file. During the deployment, I encounter the following error: "Integration Runtime busy". Is there a maximum row/record limit when moving data via DACPAC? Or could this error be related to resource limitations or configuration of the Integration Runtime?
Any guidance on best practices for handling larger tables with DACPAC would be greatly appreciated.
Solved! Go to Solution.
Hi @Mahimaa29,
Can you post a screenshot of your copy job?
When it is running, can you monitor the system resources (CPU, memory) of your gateway machine and see what is going on there?
You might need a more powerful gateway server.
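One way to do that monitoring is a small polling script on the gateway machine itself. A minimal sketch, assuming the third-party `psutil` package is available (PerfMon or Task Manager give the same numbers if you prefer a built-in tool):

```python
# Sketch: sample CPU and memory on the gateway host while the copy job runs.
# Assumes the third-party `psutil` package (pip install psutil); this is a
# generic monitoring loop, not anything specific to the Fabric gateway.
import psutil

def sample_gateway_load(samples: int = 5, interval_s: float = 2.0):
    """Return a list of (cpu_percent, memory_percent) pairs."""
    readings = []
    for _ in range(samples):
        cpu = psutil.cpu_percent(interval=interval_s)  # blocks for interval_s
        mem = psutil.virtual_memory().percent
        readings.append((cpu, mem))
        print(f"CPU {cpu:5.1f}%  MEM {mem:5.1f}%")
    return readings
```

Run it while the copy job is in flight: sustained CPU or memory near 100% suggests the gateway box itself is the bottleneck.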
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Hi @Mahimaa29,
We would like to confirm whether our community member's answer resolved your query, or if you need further help. If you still have any questions or need more support, please feel free to let us know; we are happy to help.
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
Hi @Mahimaa29,
We would like to confirm whether you have resolved your query, or if you need further help. If you still have any questions or need more support, please feel free to let us know; we are happy to help. Could you also share here the screenshot that @tayloramy requested?
@tayloramy, thanks for your prompt response.
Thank you for your patience; we look forward to hearing from you.
Best Regards,
Prashanth Are
MS Fabric community support
I’m using the Migrate option in Microsoft Fabric. The DACPAC migration step completed successfully, but during the copy job step I face intermittent issues connecting to the on-premises gateway. Sometimes the connection works, and the data migration proceeds without issues. Other times, I encounter the following error: “Integration Runtime busy.”
Since the Integration Runtime appears to be online, I'm trying to understand what causes this inconsistent connectivity. Is there a known limitation or configuration that might trigger the "Integration Runtime busy" error? What are the best practices to avoid such interruptions during migration?
Hi @Mahimaa29,
You're hitting a "plumbing-layer" issue, not a row-count limit. A DACPAC is meant to deploy schema (and optional seed scripts) to a Warehouse. The "Integration Runtime busy" message points to runtime saturation or gateway/concurrency limits during your pipeline/connection step, not to 100,000 rows being "too big."
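Since "Integration Runtime busy" is typically transient, one common mitigation is to retry the copy step with exponential backoff instead of failing the whole migration. A generic sketch; `run_copy_job` and the exception type are placeholders for whatever triggers your copy step, not a Fabric API:

```python
# Generic retry-with-exponential-backoff wrapper for a transient failure such
# as "Integration Runtime busy". `run_copy_job` is a placeholder callable
# (REST call, pipeline trigger, etc.), not an actual Fabric API.
import time

class TransientRuntimeBusy(Exception):
    """Stand-in for the transient 'Integration Runtime busy' failure."""

def run_with_backoff(run_copy_job, max_attempts=5, base_delay_s=2.0):
    """Call run_copy_job(), retrying on TransientRuntimeBusy with 2^n backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return run_copy_job()
        except TransientRuntimeBusy:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            delay = base_delay_s * (2 ** (attempt - 1))
            print(f"Runtime busy (attempt {attempt}), retrying in {delay:.0f}s")
            time.sleep(delay)
```

With `base_delay_s=2` the waits are 2, 4, 8, 16 seconds; spacing retries out also reduces concurrent load on the gateway, which is often what triggered the "busy" response in the first place.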
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.