rui_mtcarvalho
Regular Visitor

Dataframe write to Warehouse through Notebook. Is it possible?

Hi, 

 

I believe it is not possible to save the content of a dataframe directly to a Warehouse, since I can only run notebooks on a Lakehouse.

 

Is this true? Or is there a way to do it that I'm not seeing?

 

Thanks in advance!

1 ACCEPTED SOLUTION
puneetvijwani
Resolver IV

@rui_mtcarvalho You can write the data to the lakehouse and use it with a three-part name in the warehouse.
Here is the session where Pawel showed this in our community session; skip to 1:02:
https://www.youtube.com/watch?v=u7peyrNhYgw
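
For illustration, a minimal sketch of that approach, assuming the notebook has a lakehouse attached as its default (the lakehouse name MyLakehouse and table name my_table here are hypothetical):

# Write the DataFrame as a Delta table into the attached lakehouse
df.write.mode("overwrite").format("delta").saveAsTable("my_table")

# The warehouse can then query it with a three-part name, e.g. in T-SQL:
# SELECT * FROM MyLakehouse.dbo.my_table;

Since the warehouse reads the lakehouse table in place, no data gets copied.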


4 REPLIES
leisun
Microsoft Employee

We can use the Spark JDBC approach, but the performance is pretty bad. The easiest way is to use an ADF Copy activity.

 

from pyspark.sql.types import StringType

# acquire a Microsoft Entra access token for the Power BI / Fabric audience
access_token = mssparkutils.credentials.getToken("pbi")

# define the JDBC URL; replace <dwhost> and <dwName> with your Warehouse's values
jdbc_url = "jdbc:sqlserver://<dwhost>:1433;database=<dwName>;encrypt=true;trustServerCertificate=true;hostNameInCertificate=*.database.windows.net;loginTimeout=30"

# sample DataFrame to write
df = spark.sql("select 1 as id, 'james' as name, 1.3 as cost, 'jack' as name2")

def map_to_sql_type(data_type):
    if isinstance(data_type, StringType):
        return "VARCHAR(255)"
    return ""  # non-StringType columns keep Spark's default mapping

# Get the schema of the DataFrame
schema = df.schema

# Generate the column-type override string for string columns only, so they are
# created as VARCHAR(255) instead of the driver's default string mapping,
# which the Warehouse may not support
sql_schema = ", ".join([f"{field.name} {map_to_sql_type(field.dataType)}" for field in schema if isinstance(field.dataType, StringType)])

connection_properties = {
    "accessToken": access_token,
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver"
}

# write the DataFrame to the Warehouse table over JDBC
df.write.option("createTableColumnTypes", sql_schema).mode("overwrite").jdbc(
    url=jdbc_url,
    table="test6",
    properties=connection_properties
)

 

Please change the placeholders (<dwhost> and <dwName>) in the above code accordingly.

Dwhost can be found here:

[screenshot: leisun_0-1725529564277.png]
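
As a note on the code above: the accessToken connection property tells the SQL Server JDBC driver to authenticate with the Microsoft Entra token returned by mssparkutils.credentials.getToken("pbi"), so no SQL username or password should be needed in the connection properties.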

 

BryanCarmichael
Helper II

Do you need it in the Warehouse, given you can consume it in the Warehouse from the Lakehouse with no perceivable performance impact?

 

If you do, then why not use a data pipeline to run your notebook to load to the lakehouse, and then use an on-success Copy data activity to move it from the lakehouse to the warehouse?


DennesTorres
Impactful Individual

Hi,

If the lakehouse has a shortcut to the data warehouse, I believe you may be able to save the dataframe to the warehouse through the shortcut.

Kind Regards,

 

Dennes
