
Write Data into Data Warehouse from Fabric Notebook

Allow writing data directly into a Data Warehouse from a Fabric notebook using Spark, letting Fabric handle the staging process behind the scenes. This functionality would resemble what is currently available in the Synapse Analytics workspace.

Status: New
Comments
william13
New Member

Agreed - this would be very useful for companies looking to implement an LH-LH-DWH medallion architecture, writing from a Silver LH to a Gold DWH.

Also, this limitation undercuts the overarching narrative around OneLake: flexibility in reading and writing data between the different engines.

sukumar2
New Member

This feature would help us convince customers to transition to Microsoft Fabric.

sri2
New Member

Please either allow writes through a SQL connection string:


import com.microsoft.spark.sqlanalytics
from com.microsoft.spark.sqlanalytics.Constants import Constants

# Synapse-style connector; server and <database>.<schema>.<table> left blank here
df.write \
    .option(Constants.SERVER, "") \
    .mode("overwrite") \
    .synapsesql("")


or allow writing directly with the saveAsTable method:


df.write.format("delta") \
    .option("delta.columnMapping.mode", "name") \
    .mode("overwrite") \
    .saveAsTable("TABLE_NAME", path="abfss://WORKSPACE_NAME@onelake.dfs.fabric.microsoft.com/WAREHOUSE_NAME/dbo/Tables/TABLE_NAME")




Jugi
Microsoft Employee

Why would you need to load the data into the Warehouse via a notebook? You can write the data to the Lakehouse and query it directly from the Warehouse:


CREATE TABLE [research_warehouse].[dbo].[cases_by_continent]
AS
SELECT * -- column list elided in the original post
FROM [cases_lakehouse].[dbo].[bing_covid-19_data] cases;


https://learn.microsoft.com/en-us/fabric/data-warehouse/ingest-data-tsql#ingesting-data-from-tables-on-different-warehouses-and-lakehouses

Jatin_Hingorani
New Member

This is a fundamental feature for transferring incremental loads from Lakehouse (LH) tables to Warehouse (WH) tables.

Our clients primarily use a bronze-to-silver architecture, and we cannot afford to overwrite WH tables every time new data arrives, as it is neither efficient nor scalable. Instead, incremental data loading with upsert capabilities is crucial to maintaining data integrity while minimizing resource usage.

This functionality is readily available in Azure Synapse Analytics, and having it in Microsoft Fabric is essential to meet our clients' expectations. Without it, we risk losing their confidence in moving to Fabric, as the ability to handle incremental updates efficiently is a key requirement in modern data architectures. Implementing this feature will ensure we can offer a smooth transition for clients and provide the performance and flexibility they need for their data workloads.
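In the meantime, one conceivable stopgap is to drive the upsert with T-SQL over the Warehouse's SQL endpoint from within the notebook, since cross-database INSERT ... SELECT from a Lakehouse table is supported (see the docs link above). A minimal sketch follows, assuming pyodbc and the Microsoft ODBC driver are available in the notebook environment and that an Azure AD access token for the endpoint can be obtained; every name in it (silver_lakehouse, gold_warehouse, stage_orders, orders, order_id) is illustrative, not a documented Fabric API:

# Rough sketch of an interim upsert from a Fabric notebook, using pyodbc against
# the Warehouse SQL endpoint instead of a native Spark write. All object names
# and the token plumbing are illustrative assumptions.
import struct
import pyodbc

SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC connection attribute for passing an AAD token

def connect(server: str, database: str, token: str) -> pyodbc.Connection:
    # The ODBC driver expects the token as length-prefixed UTF-16-LE bytes.
    raw = token.encode("utf-16-le")
    return pyodbc.connect(
        f"DRIVER={{ODBC Driver 18 for SQL Server}};SERVER={server};DATABASE={database}",
        attrs_before={SQL_COPT_SS_ACCESS_TOKEN: struct.pack(f"<I{len(raw)}s", len(raw), raw)},
    )

conn = connect("<warehouse-sql-endpoint>", "gold_warehouse", "<aad-access-token>")
cur = conn.cursor()
# Delete the rows that are about to be replaced, then append the new batch;
# both statements commit together as a single transaction.
cur.execute(
    "DELETE FROM [gold_warehouse].[dbo].[orders] "
    "WHERE [order_id] IN (SELECT [order_id] FROM [silver_lakehouse].[dbo].[stage_orders])"
)
cur.execute(
    "INSERT INTO [gold_warehouse].[dbo].[orders] "
    "SELECT * FROM [silver_lakehouse].[dbo].[stage_orders]"
)
conn.commit()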

abauman1
New Member

Our metadata and logging tables live in a data warehouse, but some of our ELT jobs run from notebooks. This functionality is key to letting us log data into that warehouse. It is also silly that I can read data from the same data warehouse using Spark SQL, but I cannot insert records into it.
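
As a stopgap, the log writes could conceivably go through the Warehouse's SQL endpoint instead. A small sketch along those lines, where pyodbc, the Azure AD access token, and the [dbo].[etl_log] table and its columns are all assumptions:

# Sketch: inserting one log record into a hypothetical [dbo].[etl_log] table
# over the Warehouse SQL endpoint. Endpoint, database, table, and columns
# are all illustrative placeholders.
import struct
from datetime import datetime
import pyodbc

SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC attribute for passing an AAD token
raw = "<aad-access-token>".encode("utf-16-le")
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<warehouse-sql-endpoint>;DATABASE=<warehouse-name>",
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: struct.pack(f"<I{len(raw)}s", len(raw), raw)},
)
# Parameterized insert keeps the log write safe and simple.
conn.execute(
    "INSERT INTO [dbo].[etl_log] ([job_name], [status], [logged_at]) VALUES (?, ?, ?)",
    "daily_orders_load", "Succeeded", datetime.utcnow(),
)
conn.commit()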


On a side note, this ideas forum sucks. I tried to provide more details and code snippets, but I get the error message "We have encounter some malicious input. Please remove that and try again." Thanks Microsoft, your error message isn't even correct.

victor_andarcia
New Member

This should be a key feature... Our enterprise is planning to use a medallion architecture where bronze and silver ETL processes are done through notebooks and the LH; however, we also want to leverage a metadata config DW. We need a way to update attributes on the config WH, such as LastExecutionDT, from those ETL notebooks...

fbcideas_migusr
New Member
Status changed to: New