Srisakthi
Continued Contributor

When to use T-SQL notebooks

Hello All,

 

Can you please suggest in which scenarios I should consider using T-SQL notebooks for transformations?

1. Should I use them only if I have warehouse tables and need to do transformations on that data?

2. Can I use T-SQL notebooks on lakehouse tables for transformations and push the transformed data to a warehouse?

 

Regards,

Srisakthi

1 ACCEPTED SOLUTION

Hi @Srisakthi, IMHO T-SQL support in notebooks is really for ad-hoc SELECT analysis over the Lakehouse SQL endpoint. I would not look to do any ETL/ELT code in notebooks.

 

So for me, I use it for analysis of data, not for loading/engineering (the Lakehouse SQL endpoint is read-only; you cannot write data using T-SQL).
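For example, a typical ad-hoc cell might look like the sketch below (table and column names are made up for illustration):

-- Ad-hoc analysis over the Lakehouse SQL analytics endpoint (read-only).
-- dbo.Orders and dbo.Customers are hypothetical lakehouse tables.
SELECT TOP 100
    c.Region,
    COUNT(*)      AS OrderCount,
    SUM(o.Amount) AS TotalAmount
FROM dbo.Orders AS o
JOIN dbo.Customers AS c
    ON c.CustomerId = o.CustomerId
GROUP BY c.Region
ORDER BY TotalAmount DESC;

-- A write like the following would fail on the SQL analytics endpoint,
-- since lakehouse tables are read-only through T-SQL:
-- UPDATE dbo.Orders SET Amount = 0 WHERE OrderId = 1;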

 

Hope that helps.


4 REPLIES
Srisakthi
Continued Contributor

Hi @Anonymous ,

 

Thanks for your response. What I'm looking for is the scenario in which I should be using T-SQL notebooks, because in a data pipeline there is no option for T-SQL notebooks. Is a T-SQL notebook meant only for manual analysis and has to be run manually, with no automation for this?

 

Regards,

Srisakthi


Anonymous
Not applicable

Hi @Srisakthi,

As you said, pipelines currently do not seem to directly support a T-SQL-mode notebook.
For this scenario, I'd suggest creating a new common notebook to invoke in the pipeline, then opening the notebook and manually switching its mode to 'T-SQL'. (I tested this and it works normally.)
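If it helps, here is a minimal sketch of what that T-SQL-mode cell could contain, assuming the notebook's primary connection is a Warehouse (which, unlike the lakehouse SQL endpoint, accepts T-SQL writes) and a lakehouse in the same workspace. MyLakehouse, dbo.Sales, and dbo.SalesSummary are placeholder names:

-- Runs against the Warehouse; reads lakehouse data through a
-- cross-database query and persists the transformed result.
DROP TABLE IF EXISTS dbo.SalesSummary;

CREATE TABLE dbo.SalesSummary
AS
SELECT
    Region,
    SUM(Amount) AS TotalAmount
FROM MyLakehouse.dbo.Sales
GROUP BY Region;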

Regards,

Xiaoxin Sheng

Anonymous
Not applicable

Hi @Srisakthi,

#1, I think you can also use it to transform and show temporary results, or save queries as views (see the sketch after the link below).

#2, I think dataflow or pipeline features are more suitable for pushing data from the lakehouse. If you only use a notebook, you may need some libraries and PySpark scripts to achieve this (e.g., use mssparkutils and a JDBC driver to write data from a dataframe).

Solved: Re: Dataframe write to Warehouse through Notebook.... - Microsoft Fabric Community
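For #1, here is a rough sketch of saving a query as a view on the SQL analytics endpoint. A view is a metadata object, so creating it works even though the underlying lakehouse data is read-only; dbo.Sales and the view name are placeholders:

-- Persist a transformation as a view for reuse in later queries.
CREATE OR ALTER VIEW dbo.vw_MonthlySales
AS
SELECT
    YEAR(OrderDate)  AS OrderYear,
    MONTH(OrderDate) AS OrderMonth,
    SUM(Amount)      AS TotalAmount
FROM dbo.Sales
GROUP BY YEAR(OrderDate), MONTH(OrderDate);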

Regards,

Xiaoxin Sheng
