AdsLob
Regular Visitor

Switching Lakehouses in a notebook

Hi, 

 

I am wondering if MS Fabric supports switching from one lakehouse to another without attaching them to the notebook.

My goal is to develop a notebook that manages the creation of all delta tables (and some other operations). With the Delta package, it appears that specifying the lakehouse name rather than the abfss path is necessary (though I might be mistaken about this).

 

For example, in the code snippet below I am creating two delta tables located in separate lakehouses:

 

 

# Create a table in the "admin" lakehouse
DeltaTable.createIfNotExists(spark) \
    .tableName(f"{lh_admin}.{table_name}") \
    .addColumns(process_schema) \
    .execute()

# Create a table in the "products" lakehouse
DeltaTable.createIfNotExists(spark) \
    .tableName(f"{lh_products}.{table_name}") \
    .addColumns(sku_schema) \
    .execute()

 

 

However, without attaching any lakehouse, I encounter the following error:

"org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed.)".

 

In another discussion, the following code was suggested:

 

%%configure
{
    "defaultLakehouse": 
        { 
            "name": "DE_LH_999_Administration",
            "id": "8b5713a9-9848-4796-b636-6d4ec9867448"
        }
}

 

However, this code requires restarting the Livy session, which doesn't seem like a suitable approach for switching to another lakehouse.

 

Does anyone have a solution?

 

Thanks

1 ACCEPTED SOLUTION

Hi @v-cboorla-msft 

Thanks for your reply. 

As I said in a previous message, I don't have that kind of problem using Spark dataframes, since we can use the abfss path.

The issue I am having is using the DeltaTable package in a notebook where no default lakehouse is defined. In other words, I want to create, in a single notebook, all my DeltaTables for three different lakehouses.

As the DeltaTable package uses the {lakehouseName}.{tableName} notation (createIfNotExists), it seems there is no alternative to attaching the lakehouses to the notebook.
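One possible workaround to sketch here (untested in Fabric, and assuming the runtime ships delta-spark, whose DeltaTableBuilder exposes a path-based .location() alongside .tableName()): create the table by abfss path instead of by catalog name, which should not require a default lakehouse. The workspace and lakehouse names below are placeholders.

```python
def onelake_table_path(workspace_name, lakehouse_name, table_name):
    """Build the abfss URI for a table in a Fabric lakehouse (by name)."""
    return (
        f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse_name}.Lakehouse/Tables/{table_name}"
    )

def create_table_at_path(spark, path, schema):
    """Create a delta table by storage path instead of by catalog name."""
    from delta.tables import DeltaTable  # assumption: delta-spark in the Fabric runtime
    return (
        DeltaTable.createIfNotExists(spark)
        .location(path)       # path-based: no default lakehouse needed
        .addColumns(schema)
        .execute()
    )

# In a notebook session one would call, for example:
# create_table_at_path(
#     spark,
#     onelake_table_path("MyWorkspace", "DE_LH_999_Administration", "process_log"),
#     process_schema,
# )
```

Note this creates an unmanaged (path-based) table, so it would not appear under the {lakehouseName}.{tableName} catalog name unless the lakehouse picks it up from its Tables/ folder.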

 

Thanks. 


11 REPLIES
AdsLob
Regular Visitor

Hi, 

 

Thank you for your reply.

I am not really having issues using Spark dataframes as you suggested.

 

As I need to create my two tables before any other actions, I thought using the DeltaTable class might be a good idea.
If the two lakehouses are stored in the same workspace and neither of them is attached to the notebook, do you think it would be possible to use {lakehouseName}.{tableName}?

Or is there a way to attach the two lakehouses to the notebook in Python code (without using a default)?
When we deploy to the production model, the IDs won't be the same, and attaching each notebook to a lakehouse manually might be tedious.

Thanks. 

frithjof_v
Continued Contributor

I think maybe this blog can help:

 

https://fabric.guru/how-to-mount-a-lakehouse-and-identify-the-mounted-lakehouse-in-fabric-notebook

 

I haven't tried this myself.
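For reference, the mounting approach from that blog might look roughly like the sketch below (untested; it assumes the Fabric runtime's mssparkutils module, and the names are placeholders). A mount exposes the lakehouse's Files/ and Tables/ folders as paths; it does not register the lakehouse in the Spark SQL catalog.

```python
def lakehouse_url(workspace_name, lakehouse_name):
    """Build the abfss URL for a whole Fabric lakehouse."""
    return (
        f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse_name}.Lakehouse"
    )

def mount_lakehouse(url, mount_point):
    """Mount a lakehouse and return its local mount path.

    Only callable inside a Fabric notebook session.
    """
    from notebookutils import mssparkutils  # assumption: Fabric runtime module
    mssparkutils.fs.mount(url, mount_point)
    return mssparkutils.fs.getMountPath(mount_point)

# e.g. inside a notebook:
# local_path = mount_lakehouse(
#     lakehouse_url("MyWorkspace", "DE_LH_999_Administration"), "/lh_admin"
# )
```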

 

 

I think the {lakehouseName}.{tableName} notation can only be used for lakehouses in the same workspace as the notebook's default lakehouse (which doesn't have to be the workspace the notebook resides in). So I think you must have a default lakehouse set in the notebook to use that notation style.

Yes, I have already tried mounting a lakehouse.
Using this method we can see all the delta tables and the "Files" folder, but we cannot apply this to the DeltaTable class.
As you said, it seems there is no alternative to attaching lakehouses to the notebook in order to use the {lakehouseName}.{tableName} notation.

Hi @AdsLob 

 

Thanks for using Microsoft Fabric Community.

Apologies for the delay in response from my end.

For a similar inquiry, you can reference the following link : MS Fabric - Notebook data load from a lakehouse that's NOT default? - Stack Overflow

I hope this information helps. Please do let us know if you have any further questions.

 

Thanks.

Hi @AdsLob 

 

We haven't heard from you on the last response and were just checking back to see if you have a resolution yet.
If you have a resolution, please share it with the community, as it can be helpful to others.
Otherwise, respond back with more details and we will try to help.

 

Thanks.


Hi @AdsLob 

 

Apologies for the delay in response.

 

Yes, direct creation of Delta tables in Microsoft Fabric lakehouses from notebooks is currently not supported without explicitly attaching a lakehouse. This approach prioritizes two key objectives: security and efficient configuration.

Security: Attaching the Lakehouse ensures authorized access and granular control over the data. This safeguards sensitive information within the Lakehouse environment.

Efficient Configuration: Lakehouse attachment streamlines notebook configuration by automatically provisioning the necessary settings for interacting with the storage system and Delta Lake format. This includes authentication details and relevant connection parameters.

 

I hope this information helps. Please do let us know if you have further queries.

 

Thank you.

 

Hi @v-cboorla-msft 

 

That is what I thought. 

Thank you for your explanation.




Hi @AdsLob 

 

We haven't heard from you on the last response and were just checking back to see if you have a resolution yet. If you have a resolution, please share it with the community, as it can be helpful to others.
Otherwise, respond back with more details and we will try to help.

 

Thanks.

Hi @AdsLob 

 

We haven't heard from you on the last response and were just checking back to see if you have a resolution yet. If you have a resolution, please share it with the community, as it can be helpful to others.
If you have any question relating to the current thread, please let us know and we will try our best to help you.
If you have a question about a different issue, we request that you open a new thread.

 

Thanks.

frithjof_v
Continued Contributor

I think the notation you are using {lakehouseName}.{tableName} only works in the context of the Fabric workspace where your notebook's default Lakehouse resides.

 

If you want to access/write to Lakehouses in other workspaces (or if you don't select a default Lakehouse for your Notebook) then I think you need to use the abfss path.

 

Something like this: 

 

df.write.format("delta").mode("overwrite").save(f"abfss://{workspace_name}@onelake.dfs.fabric.microsoft.com/{lakehouse_name}.Lakehouse/Tables/{table_name}")

 

or, if your object names have special characters or whitespace, you could use the IDs:

 

df.write.format("delta").mode("overwrite").save(f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/{lakehouse_id}/Tables/{table_name}")
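Building on the id-based form above, a small sketch of the read/update counterpart (assuming delta-spark's DeltaTable.forPath; the GUID-style arguments are placeholders). Like the write example, it is path-based and should work without a default lakehouse:

```python
def onelake_path_by_id(workspace_id, lakehouse_id, table_name):
    """Build the abfss URI using GUIDs (robust to spaces or special characters in names)."""
    return (
        f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse_id}/Tables/{table_name}"
    )

def load_delta_table(spark, path):
    """Open an existing delta table by storage path for reads/updates/merges."""
    from delta.tables import DeltaTable  # assumption: delta-spark in the Fabric runtime
    return DeltaTable.forPath(spark, path)  # path-based, no default lakehouse needed

# e.g. inside a notebook:
# dt = load_delta_table(spark, onelake_path_by_id(workspace_id, lakehouse_id, "process_log"))
# dt.toDF().show()
```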

 

Ref.:

https://community.fabric.microsoft.com/t5/General-Discussion/Writing-to-a-Lakehouse-in-different-wor...
