Hi
I'm fairly new to working with Fabric and data warehouses in general.
We have some ETL processes outside Fabric where we produce a finished dataset.
I was hoping to write directly into a DWH table from my external app.
It does work, but when I look at the queries on the DWH side it issues one query per row in my dataset, which means it took me almost 2 hours to insert 8k rows.
Is it not possible to do this directly?
Using either oledb or bulk insert to DWH?
Hi @Fragmaticx,
Thank you for reaching out to Microsoft Fabric Community Forum.
It appears that you're encountering slow performance when inserting data directly into your DWH table from an external application. This often occurs if each row is being inserted individually, which is not efficient for large datasets.
Consider using data pipelines to load bulk data from storage solutions like Azure Blob Storage or other external sources into your DWH.
Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to give "Kudos"
Regards,
Vinay Pabbu
Hi @Fragmaticx,
As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided resolved the issue. Let us know if you need any further assistance.
If our response addressed your question, please mark it as the accepted solution and click Yes if you found it helpful.
Regards,
Vinay Pabbu
Hi @Fragmaticx,
May I ask if you have gotten this issue resolved?
If it is solved, please mark the helpful reply as the accepted solution, or share your own solution and accept it. This will help other community members with similar problems find the answer faster.
Regards,
Vinay Pabbu
Hi @Fragmaticx ,
To write data efficiently to a Microsoft Fabric Data Warehouse (DWH) from an external application, you need to use optimized bulk-ingestion methods that avoid row-by-row inserts, which are inherently slow.
The COPY INTO command is a high-performance method for bulk loading data into Microsoft Fabric DWH. It supports loading data from external sources such as Azure Blob Storage or ADLS Gen2 in formats like Parquet or CSV. This method is highly efficient and can handle millions of rows in minutes.
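For illustration, here is a minimal COPY INTO sketch. The table name, storage account, container, path, and SAS token are all placeholders to replace with your own values, and the exact options available depend on your Fabric Warehouse configuration:

```sql
-- Bulk-load Parquet files staged in Azure Blob Storage into a warehouse table.
-- dbo.SalesStaging, the storage URL, and <sas-token> are placeholders.
COPY INTO dbo.SalesStaging
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/exports/*.parquet'
WITH (
    FILE_TYPE = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);
```

The typical pattern is: your external app writes the finished dataset to blob storage as Parquet or CSV in one step, then a single COPY INTO statement loads all the files at once, instead of one INSERT per row.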
If this post helps, then please give us Kudos and consider Accept it as a solution to help the other members find it more quickly.
It might be better to pull instead of push. If you already have a finished dataset, consider using pipelines or Dataflow Gen2 to pull it in and use Fast Copy to load it into the DWH.