kevin_oleary
Regular Visitor

Schedule Extract from Web App to Push to Fabric Data Warehouse

Hello! I am new to Fabric, as are the rest of our devs/data engineers at the company. We are planning to set up a data warehouse in Fabric to improve scalability as our company grows, and I have been tasked with researching the process.

 

Our devs are adamant that they do not want Fabric to PULL data from our DB into the data warehouse; rather, we would like to schedule a data PUSH from our own web app to Fabric. Currently our web app is optimized to push data (ad hoc or on a schedule) to an FTP server or to send a data file to an email destination. I have searched these forums and elsewhere on the web for whether Fabric Data Warehouse can act like an FTP server that our web app pushes files to on a schedule, but most results describe the opposite: Fabric pulling data from our DB on a schedule through a data pipeline set up in Fabric.

 

Is it possible to treat the fabric data warehouse like an FTP server for the sake of pushing data to it? If so, how do I go about setting that up? If it helps, our web app is hosted in Microsoft Entra. Any help is greatly appreciated, thank you!

2 ACCEPTED SOLUTIONS
Vinodh247
Solution Supplier

You cannot treat Fabric DW like an FTP server and just push files directly into it. Fabric does not expose an FTP endpoint, nor can you "drop" files into the warehouse the way you would with FTP. 

 

The pattern in Fabric is different:

  • Fabric warehouses and lakehouses sit on top of OneLake (Fabric’s data lake layer).

  • To load data, you either ingest into OneLake (via files, APIs, or pipelines) and then make it queryable in the Data Warehouse, or you use Fabric-native pipelines / Dataflows Gen2 to connect and load.

 

Since your developers want a push model instead of Fabric pulling data, here are realistic approaches:

Pushing to OneLake via APIs:

  • OneLake exposes APIs (and OneLake file system connectors) that allow you to upload data files (CSV, Parquet, JSON).

  • Your web app can push files directly into a Fabric Lakehouse or a shortcut in OneLake (see the sketch after this list).

  • From there, you can either:

    • Auto-ingest into the Data Warehouse (Fabric provides COPY INTO and external tables for ingestion), or

    • Use a small Fabric pipeline/notebook that triggers on file arrival to load the data into warehouse tables.
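As a minimal sketch of that upload, assuming a Python app with the azure-identity and azure-storage-file-datalake packages; the workspace, lakehouse, and file names below are placeholders, not anything from this thread. OneLake speaks the same API as ADLS Gen2, with the workspace acting as the filesystem:

# Minimal sketch: upload a file into a Fabric Lakehouse through OneLake's
# ADLS Gen2-compatible endpoint. Names in angle brackets are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    "https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),  # app registration or managed identity
)

# In OneLake the workspace is the "filesystem"; the item path includes
# the lakehouse name with its .Lakehouse suffix.
fs = service.get_file_system_client("<WorkspaceName>")
file = fs.get_file_client("<LakehouseName>.Lakehouse/Files/landing/orders/orders.parquet")

with open("orders.parquet", "rb") as data:
    file.upload_data(data, overwrite=True)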

Alternatively, connecting directly over T-SQL:

 

  • If your web app already authenticates through Microsoft Entra ID, it could connect directly to the Fabric Data Warehouse SQL endpoint (it is T-SQL compatible, like Azure Synapse).

  • You could use INSERT INTO statements or bulk inserts via the ODBC/JDBC drivers (a rough sketch follows this list).

  • Downside: you have to manage batching, schema, and retries.
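A rough sketch of that direct T-SQL path, assuming Python with pyodbc and ODBC Driver 18, a service principal, and a hypothetical dbo.StagingOrders table; the server and credential values are placeholders you would copy from the warehouse's connection settings:

# Hedged sketch: push rows straight into the warehouse over its T-SQL endpoint.
# Server name, credentials, and the dbo.StagingOrders table are placeholders.
from datetime import date
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<guid>.datawarehouse.fabric.microsoft.com;"
    "Database=<WarehouseName>;"
    "Authentication=ActiveDirectoryServicePrincipal;"
    "UID=<app-client-id>;PWD=<client-secret>;"
    "Encrypt=yes;"
)

rows = [(1, 42, date(2025, 1, 15), 99.90), (2, 42, date(2025, 1, 16), 12.50)]

with pyodbc.connect(conn_str) as conn:
    cur = conn.cursor()
    cur.fast_executemany = True  # send the batch in one round trip, not row by row
    cur.executemany(
        "INSERT INTO dbo.StagingOrders (OrderId, CustomerId, OrderDate, Amount) "
        "VALUES (?, ?, ?, ?)",
        rows,
    )
    conn.commit()

This keeps everything in your app's push model, but as noted above you own the batching, retries, and schema management yourself.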

 

 

 

Please 'Kudos' and 'Accept as Solution' if this answered your query.

Regards,
Vinodh
Microsoft MVP [Fabric]


tayloramy
Community Champion

Hi @kevin_oleary ,

No - Fabric OneLake and Fabric Data Warehouse do not act like an FTP or SFTP server. OneLake exposes Azure Data Lake Storage Gen2 compatible APIs (DFS/ABFS), not FTP/SFTP endpoints. If you need SFTP, land files in Azure Blob Storage with SFTP enabled, then ingest into Fabric.
Docs: "How do I connect to OneLake?", "OneLake API parity with ADLS/Blob", and "SFTP support for Azure Blob Storage"

A push-friendly pattern that works

  1. Land files where your app can push them: Azure Blob Storage with SFTP enabled (your app already pushes over FTP), or OneLake directly through its ADLS Gen2-compatible APIs. A minimal SFTP sketch follows this list.
  2. Ingest into the Fabric Warehouse: run COPY INTO from the landing path (T-SQL example below).
  3. Trigger it (schedule or event-based): run a Fabric pipeline on a schedule, or fire it from a storage event when a file arrives.
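As a sketch of step 1 over SFTP, assuming your app can script the push with Python's paramiko and that the storage account has SFTP enabled with a local user; the host, user, and paths are placeholders (Blob SFTP usernames take the <account>.<user> form):

# Hedged sketch: land a file in Azure Blob Storage over SFTP so COPY INTO
# can pick it up later. Host, local user, and paths are placeholders.
import paramiko

transport = paramiko.Transport(("<storageaccount>.blob.core.windows.net", 22))
transport.connect(username="<storageaccount>.<localuser>", password="<sftp-password>")

sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("orders.parquet", "landing/orders/orders.parquet")  # relative to the user's home container
sftp.close()
transport.close()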

Why this fits a push model

Your app decides when files land, and Fabric only reacts to arrivals, so nothing has to reach into your database and pull.

Nice extras

  • Shortcuts: if data lives in ADLS Gen2, create a OneLake Shortcut to that path for lakehouse exploration, while Warehouse loads still use COPY INTO. Docs: "OneLake shortcuts overview" and "Create an ADLS Gen2 shortcut"
  • Email sources: if the app only emails files, a small Logic App can save attachments into Blob, then your pipeline runs as above. (Typical Azure pattern; pair with the event trigger doc above.)

Example: load CSV or Parquet into Warehouse with T-SQL

-- 1) Create target table (simplified)
CREATE TABLE dbo.StagingOrders(
  OrderId int, CustomerId int, OrderDate date, Amount decimal(18,2)
);

-- 2) High-throughput load from Azure Storage (CSV or PARQUET)
--    Authenticate with SAS, service principal, or managed identity.
COPY INTO dbo.StagingOrders
FROM 'https://<storageaccount>.dfs.core.windows.net/<container>/landing/orders/'
WITH (
  FILE_TYPE = 'PARQUET'  -- or 'CSV' with CSV options
);

Docs: COPY INTO for Fabric Warehouse

 

If you found this helpful, please give Kudos. If this answers your question, please mark it as a solution so others can find it.


7 REPLIES
v-menakakota
Community Support

Hi @kevin_oleary ,
Thanks for reaching out to the Microsoft Fabric community forum.

 

I would also like to take a moment to thank @tayloramy and @Vinodh247 for actively participating and for the solutions they have been sharing in the community forum. Your contributions make a real difference.

I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We’re always here to support you.

Best Regards, 
Community Support Team.



This was a wealth of information and I really appreciate the options you listed. It gave me some good guidance on what to present to my boss as a possible data solution. Thank you!


I appreciate the response. I'm not sure what will work for us, but it helped me get some ideas as to how to present this as a potential data solution to my boss. Thank you!
