Hello! I am new to Fabric, as are all of the other devs/data engineers at our company. We are planning to set up a data warehouse in Fabric to improve scalability as the company grows, and I have been tasked with researching the process.
Our devs are adamant that they do not want Fabric to PULL data from our DB into the data warehouse; instead, we would like to schedule a data PUSH from our own web app to Fabric. Our web app is already optimized to push data (ad hoc or on a schedule) to an FTP server or to send a data file to an email destination. I have searched these forums and the wider web for whether Fabric Data Warehouse can act like an FTP server that our web app pushes files to on a schedule, but most results describe Fabric pulling data from our DB on a schedule set up through a data pipeline in Fabric.
Is it possible to treat the Fabric data warehouse like an FTP server for the sake of pushing data to it? If so, how do I go about setting that up? If it helps, our web app uses Microsoft Entra for authentication. Any help is greatly appreciated, thank you!
You cannot treat Fabric DW like an FTP server and just push files directly into it. Fabric does not expose an FTP endpoint, nor can you "drop" files into the warehouse the way you would with FTP.
The pattern in Fabric is different:
Fabric warehouses and lakehouses sit on top of OneLake (Fabric’s data lake layer).
To load data, you either ingest into OneLake (via files, APIs, or pipelines) and then make it queryable in the Data Warehouse, or you use Fabric-native pipelines / Dataflows Gen2 to connect and load.
Since your developers want a push model instead of Fabric pulling data, here are realistic approaches:
OneLake exposes ADLS Gen2-compatible APIs (and OneLake file system connectors) that allow you to upload data files (CSV, Parquet, JSON).
Your web app can push files directly into a Fabric Lakehouse or a shortcut in OneLake (see the Python sketch after this list).
From there, you can either:
Auto-ingest into the Data Warehouse (Fabric provides COPY INTO and external tables for ingestion), or
Use a small Fabric pipeline/notebook that triggers on file arrival to load the data into warehouse tables.
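For the push-to-OneLake approach, here is a minimal Python sketch, assuming the azure-identity and azure-storage-file-datalake packages; the workspace, lakehouse, and file names are placeholders, so check the path layout against your own workspace:

# Minimal sketch: push a file into OneLake over its ADLS Gen2-compatible
# endpoint. Workspace, lakehouse, and file names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake speaks the ADLS Gen2 protocol at this endpoint.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),  # service principal / managed identity
)

# The workspace acts as the container; the lakehouse item and its
# Files folder form the rest of the path.
fs = service.get_file_system_client("MyWorkspace")  # placeholder
file_client = fs.get_file_client(
    "MyLakehouse.Lakehouse/Files/landing/orders/orders.parquet"  # placeholder
)

with open("orders.parquet", "rb") as data:
    file_client.upload_data(data, overwrite=True)

From there, a pipeline trigger or a scheduled COPY INTO can move the landed files into warehouse tables.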
Alternatively, if your web app already runs in Azure and can authenticate with Microsoft Entra ID (formerly Azure AD), it could connect directly to the Fabric Data Warehouse SQL endpoint, which is T-SQL compatible (similar to Azure Synapse).
You could use INSERT INTO statements or bulk inserts via ODBC/JDBC drivers (see the sketch below).
Downside: you have to manage batching, schema changes, and retries yourself.
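A rough Python sketch of that direct T-SQL push, assuming pyodbc, ODBC Driver 18 for SQL Server, a service principal, and placeholder endpoint/table names (copy the real SQL endpoint from your warehouse settings):

# Rough sketch: batched INSERTs into a Fabric warehouse over ODBC,
# authenticating with a Microsoft Entra service principal.
import datetime
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"  # placeholder
    "Database=<your-warehouse>;"                                  # placeholder
    "Authentication=ActiveDirectoryServicePrincipal;"
    "UID=<app-client-id>;PWD=<client-secret>;"                    # placeholders
    "Encrypt=yes;"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # send each batch in one round trip

rows = [
    (1, 42, datetime.date(2025, 1, 15), 99.90),
    (2, 43, datetime.date(2025, 1, 15), 18.50),
]
cursor.executemany(
    "INSERT INTO dbo.StagingOrders (OrderId, CustomerId, OrderDate, Amount) "
    "VALUES (?, ?, ?, ?)",  # dbo.StagingOrders is a placeholder table
    rows,
)
conn.commit()

Keep batches modest and wrap each one in a transaction so a failed push can be retried cleanly.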
Hi @kevin_oleary ,
No - Fabric OneLake and Fabric Data Warehouse do not act like an FTP or SFTP server. OneLake exposes Azure Data Lake Storage Gen2 compatible APIs (DFS/ABFS), not FTP/SFTP endpoints. If you need SFTP, land files in Azure Blob Storage with SFTP enabled, then ingest into Fabric.
Docs: "How do I connect to OneLake?", "OneLake API parity with ADLS/Blob", and "SFTP support for Azure Blob Storage".
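If you take the SFTP landing-zone route, the push from your web app could look like this minimal sketch, assuming the paramiko package, an SFTP-enabled storage account, and placeholder local-user credentials:

# Minimal sketch: push a file to an SFTP-enabled Azure Blob Storage
# account, which Fabric then ingests from. All names are placeholders.
import paramiko

host = "<storageaccount>.blob.core.windows.net"  # SFTP endpoint, port 22
transport = paramiko.Transport((host, 22))
# Blob Storage SFTP local users sign in as "<storageaccount>.<username>".
transport.connect(username="<storageaccount>.<localuser>", password="<password>")

sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("orders.parquet", "landing/orders/orders.parquet")  # local -> remote path
sftp.close()
transport.close()

Once files have landed in the container, the COPY INTO below loads them into warehouse tables: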
-- 1) Create target table (simplified)
CREATE TABLE dbo.StagingOrders (
    OrderId    int,
    CustomerId int,
    OrderDate  date,
    Amount     decimal(18,2)
);

-- 2) High-throughput load from Azure Storage (CSV or PARQUET)
-- Authenticate with SAS, service principal, or managed identity.
COPY INTO dbo.StagingOrders
FROM 'https://<storageaccount>.dfs.core.windows.net/<container>/landing/orders/'
WITH (
    FILE_TYPE = 'PARQUET'  -- or 'CSV' with CSV options
);
Docs: COPY INTO for Fabric Warehouse
If you found this helpful, please give Kudos. If this answers your question, please mark it as a solution so others can find it.
Hi @kevin_oleary ,
Thanks for reaching out to the Microsoft Fabric community forum.
I would also like to take a moment to thank @tayloramy and @Vinodh247 for actively participating and for the solutions you've been sharing in the community forum. Your contributions make a real difference.
I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We’re always here to support you.
Best Regards,
Community Support Team.
This was a wealth of information and I really appreciate the options you listed. It gave me some good guidance on what to present to my boss as a possible data solution. Thank you!
I appreciate the response. I'm not sure what will work for us, but it helped me get some ideas as to how to present this as a potential data solution to my boss. Thank you!