CrhIT
Frequent Visitor

Storing Data from a Notebook into a Warehouse in Microsoft Fabric

Hello everyone,

I am wondering whether it is possible to store data directly from a notebook into a warehouse in Microsoft Fabric. If this is not feasible, I would like to set up a Copy Data activity that loads data incrementally using a date field.

Could anyone guide me on how to set this up or share best practices for this approach?

Thank you in advance for your help!

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @CrhIT ,

I think you can follow the steps below:

1. As FabianSchut mentioned, Microsoft announced the public preview of T-SQL notebooks in Fabric.


2. You can first store your data in a Lakehouse using Python cells in your notebook. Then, use T-SQL cells to create tables in the warehouse from the Lakehouse tables. Here is the document you can read: Data warehouse tutorial - analyze data with a notebook - Microsoft Fabric | Microsoft Learn
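The two-cell pattern above can be sketched as follows. The PySpark write is shown only as a comment (it runs inside a Fabric notebook, not locally), and a small helper builds the T-SQL CREATE TABLE AS SELECT statement as a string for illustration. All table and Lakehouse names here are assumptions, not from the thread.

```python
# Cell 1 (Python/PySpark) would land the data in the Lakehouse, e.g.:
#   df.write.mode("overwrite").saveAsTable("sales_raw")   # hypothetical names
# Cell 2 (T-SQL) then materializes it in the Warehouse. The helper below
# builds that CTAS statement as a string, purely for illustration.

def build_ctas(warehouse_table: str, lakehouse_table: str) -> str:
    """Return a CREATE TABLE AS SELECT statement that copies a Lakehouse
    table into a Warehouse table (all names are assumptions)."""
    return (
        f"CREATE TABLE {warehouse_table} AS\n"
        f"SELECT * FROM {lakehouse_table};"
    )

print(build_ctas("dbo.sales", "MyLakehouse.dbo.sales_raw"))
# CREATE TABLE dbo.sales AS
# SELECT * FROM MyLakehouse.dbo.sales_raw;
```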


3. If direct storage isn't feasible or you prefer a different approach, you can use a Data Factory pipeline to implement incremental data loading:

In the Data Factory section, create a new pipeline and name it appropriately. Add a Copy Data activity from the Move & Transform section.

Use a date field to filter and load only new or updated records. You can set this up in the Source settings of the Copy Data activity by specifying a query that selects data based on the date field.
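The watermark-based source query described above can be sketched in Python; the resulting string is what you would put in the Source query of the Copy Data activity. The table and column names are hypothetical.

```python
from datetime import date

def incremental_query(table: str, date_column: str, watermark: date) -> str:
    """Build a source query that selects only rows changed after the last
    successful load (the watermark). All names are illustrative."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {date_column} > '{watermark.isoformat()}'"
    )

print(incremental_query("dbo.orders", "ModifiedDate", date(2024, 10, 1)))
# SELECT * FROM dbo.orders WHERE ModifiedDate > '2024-10-01'
```

In a real pipeline the watermark would typically come from a control table or a pipeline variable that is updated after each successful run, rather than a hard-coded date.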


You can also look at this document: Data warehouse tutorial - ingest data into a Warehouse in Microsoft Fabric - Microsoft Fabric | Micr...


Best Regards

Yilong Zhou

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


2 REPLIES

FabianSchut
Super User

Hi @CrhIT,


Microsoft announced the public preview of T-SQL notebooks in Fabric: https://blog.fabric.microsoft.com/en-us/blog/announcing-public-preview-of-t-sql-notebook-in-fabric/. With this feature, you can modify the data warehouse tables. You ask whether it is possible to store data from a notebook into a warehouse. Which data are you referring to? Are you first retrieving the data using Python cells?

If that is the case, you could first store the data in the lakehouse using Python, then use a T-SQL cell to create a table in the warehouse from the lakehouse table with a CREATE TABLE AS SELECT (CTAS) statement. I'm not sure whether that is possible within a single notebook or whether you would need two notebooks.

Using this approach, you could load the data incrementally into the lakehouse with the Python script first and then copy it to the warehouse. When you run this more than once (which you probably will), don't forget to drop the old warehouse table before you recreate it as a copy of the lakehouse table in the next run.
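The drop-then-recreate refresh described above amounts to a pair of T-SQL statements, built here as strings in Python purely for illustration; the Warehouse and Lakehouse table names are hypothetical.

```python
def refresh_statements(warehouse_table: str, lakehouse_table: str) -> list:
    """T-SQL for a repeatable refresh: drop the previous Warehouse copy,
    then recreate it from the Lakehouse table (all names are assumptions)."""
    return [
        f"DROP TABLE IF EXISTS {warehouse_table};",
        f"CREATE TABLE {warehouse_table} AS "
        f"SELECT * FROM {lakehouse_table};",
    ]

for stmt in refresh_statements("dbo.sales", "MyLakehouse.dbo.sales_raw"):
    print(stmt)
# DROP TABLE IF EXISTS dbo.sales;
# CREATE TABLE dbo.sales AS SELECT * FROM MyLakehouse.dbo.sales_raw;
```

Running these two statements in a T-SQL cell at the start of each run keeps the warehouse copy in sync with the latest lakehouse table.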
