Fragmaticx
Frequent Visitor

Write data directly to DWH from external application?

Hi 

I'm fairly new to working with Fabric/DWH in general.

We have some ETL processes outside Fabric where we can produce a finished dataset, and I was hoping to write it directly into a DWH table from my external app.

It does work, but when I look at the queries on the DWH side, it issues one query per row of my dataset, meaning it took me almost 2 hours to insert 8k rows.

Is it not possible to do this directly?

Using either OLE DB or bulk insert to the DWH?


1 ACCEPTED SOLUTION
v-vpabbu
Community Support

Hi @Fragmaticx,

 

Thank you for reaching out to Microsoft Fabric Community Forum.

 

It appears that you're encountering slow performance when inserting data directly into your DWH table from an external application. This often occurs if each row is being inserted individually, which is not efficient for large datasets.

Consider using data pipelines to load bulk data from storage solutions like Azure Blob Storage or other external sources into your DWH.


Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to give "Kudos"

 

Regards,

Vinay Pabbu


6 REPLIES
v-vpabbu
Community Support

Hi @Fragmaticx,

 

As we haven't heard back from you, we wanted to kindly follow up to check whether the solution provided for the issue worked, or let us know if you need any further assistance.
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.

 

Regards,

Vinay Pabbu

Hi @Fragmaticx,


May I ask if you have gotten this issue resolved?

If it is solved, please mark the helpful reply or share your solution and accept it as the solution; this will help other community members with similar problems solve them faster.

 

Regards,

Vinay Pabbu


nilendraFabric
Community Champion

Hi @Fragmaticx ,

 

To efficiently write data directly to a Microsoft Fabric Data Warehouse (DWH) from an external application, you need to use optimized data ingestion methods that avoid row-by-row inserts, which are inherently slow.

 

Use the COPY INTO Command

The COPY INTO command is a high-performance method for bulk loading data into Microsoft Fabric DWH. It supports loading data from external sources such as Azure Blob Storage or ADLS Gen2 in formats like Parquet or CSV. This method is highly efficient and can handle millions of rows in minutes.

 

COPY INTO [dbo].[YourTable]
FROM 'https://yourstorageaccount.blob.core.windows.net/container/yourfile.csv'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'ManagedIdentity')
);
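Your external app can issue the same COPY INTO statement over an ordinary SQL connection once the file is staged in blob storage, so the whole load is one set-based statement instead of 8k single-row inserts. A minimal sketch (the table name, URL, and connection string are placeholders, not values from this thread):

```python
def build_copy_into(table: str, source_url: str, file_type: str = "CSV") -> str:
    """Build a single set-based COPY INTO statement for a file already staged in storage."""
    return (
        f"COPY INTO {table}\n"
        f"FROM '{source_url}'\n"
        f"WITH (FILE_TYPE = '{file_type}')"
    )

# Execute it over any SQL connection to the warehouse, e.g. with pyodbc:
#   import pyodbc
#   with pyodbc.connect(conn_str, autocommit=True) as conn:
#       conn.execute(build_copy_into("[dbo].[YourTable]", staged_file_url))
```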
 
You can connect your external application to the Fabric DWH using OLE DB or ODBC drivers. Ensure that you configure the connection for bulk operations:
  • Use "Fast Load" mode in OLE DB to enable batch inserts.
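If the load has to stay a push from the application, the key is the same idea as Fast Load: send many rows per round trip instead of one. A hedged sketch of batched inserts with pyodbc's fast_executemany (connection string, table, and column names are placeholders):

```python
from itertools import islice

def chunked(rows, size=1000):
    """Yield successive batches of rows so each round trip carries many inserts."""
    it = iter(rows)
    while batch := list(islice(it, size)):
        yield batch

# With pyodbc, fast_executemany turns executemany into an array-bound bulk
# insert instead of one round trip per row:
#   import pyodbc
#   with pyodbc.connect(conn_str) as conn:
#       cur = conn.cursor()
#       cur.fast_executemany = True  # parameter-array binding
#       for batch in chunked(dataset, 10_000):
#           cur.executemany("INSERT INTO dbo.YourTable (a, b) VALUES (?, ?)", batch)
#       conn.commit()
```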

 

 


If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.


lbendlin
Super User

Might be better to pull instead of push. If you already have a finished dataset, then maybe use pipelines or DF Gen2 to pull it in and fast-copy it to the DWH.
