HarderT
Helper II

Error Data Warehouse/Lakehouse

Hi everyone,

Since last week, our Data Warehouse connection has been failing with the following errors:

 

"Message=<ccon>Warning: Fatal error 615 occurred at May 26 2025 8:08AM. Note the error and time, and contact your system administrator.\r\nLogin failed for user '<token-identified principal>'.</ccon>, Code=21, State=1"

 

and also:

 

ErrorCode=SqlFailedToConnect,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot connect to SQL Database. Please contact SQL server team for further support. Server: 'xxx', Database: 'xxx', User: ''. Check the connection configuration is correct, and make sure the SQL Database firewall allows the Data Factory runtime to access.,Source=Microsoft.DataTransfer.Connectors.MSSQL,''Type=Microsoft.Data.SqlClient.SqlException,Message=Warning: Fatal error 615 occurred at May 22 2025 12:15AM. Note the error and time, and contact your system administrator.
Login failed for user '<token-identified principal>'.,Source=Framework Microsoft SqlClient Data Provider,'

 

I have restored the database, but the lakehouses still show the same errors. When I look at the tables in the Lakehouse view, I can see data, but when I switch to the SQL endpoint, no data is shown and the error message appears. This all started on May 22nd, with the MS Fabric update.

 

Is there a solution for the Lakehouse?

Many thanks

1 ACCEPTED SOLUTION
Srisakthi
Continued Contributor

Hi @HarderT ,

 

You can utilise Semantic Link Labs to repoint the semantic model to a new lakehouse.

 

import sempy.fabric as fabric
from sempy_labs.directlake._get_shared_expression import get_shared_expression
from sempy_labs._helper_functions import (
    resolve_lakehouse_name,
)
from sempy_labs.tom import connect_semantic_model
from typing import Optional
import sempy_labs._icons as icons


def update_direct_lake_model_lakehouse_connection(
    dataset: str,
    workspace: Optional[str] = None,
    lakehouse: Optional[str] = None,
    lakehouse_workspace: Optional[str] = None,
):
    """
    Remaps a Direct Lake semantic model's SQL Endpoint connection to a new lakehouse.

    Parameters
    ----------
    dataset : str
        Name of the semantic model.
    workspace : str, default=None
        The Fabric workspace name in which the semantic model exists.
        Defaults to None which resolves to the workspace of the attached lakehouse
        or if no lakehouse attached, resolves to the workspace of the notebook.
    lakehouse : str, default=None
        The Fabric lakehouse used by the Direct Lake semantic model.
        Defaults to None which resolves to the lakehouse attached to the notebook.
    lakehouse_workspace : str, default=None
        The Fabric workspace used by the lakehouse.
        Defaults to None which resolves to the workspace of the attached lakehouse
        or if no lakehouse attached, resolves to the workspace of the notebook.

    Returns
    -------

    """

    workspace = fabric.resolve_workspace_name(workspace)

    if lakehouse_workspace is None:
        lakehouse_workspace = workspace

    if lakehouse is None:
        lakehouse_id = fabric.get_lakehouse_id()
        lakehouse = resolve_lakehouse_name(lakehouse_id, lakehouse_workspace)

    # Check if lakehouse is valid
    dfI = fabric.list_items(workspace=lakehouse_workspace, type="Lakehouse")
    dfI_filt = dfI[(dfI["Display Name"] == lakehouse)]

    if len(dfI_filt) == 0:
        raise ValueError(
            f"{icons.red_dot} The '{lakehouse}' lakehouse does not exist within the '{lakehouse_workspace}' workspace. "
            f"Therefore it cannot be used to support the '{dataset}' semantic model within the '{workspace}' workspace."
        )

    dfP = fabric.list_partitions(dataset=dataset, workspace=workspace)
    dfP_filt = dfP[dfP["Mode"] == "DirectLake"]

    if len(dfP_filt) == 0:
        raise ValueError(
            f"{icons.red_dot} The '{dataset}' semantic model is not in Direct Lake. This function is only applicable to Direct Lake semantic models."
        )
    else:
        with connect_semantic_model(
            dataset=dataset, readonly=False, workspace=workspace
        ) as tom:

            shEx = get_shared_expression(lakehouse, lakehouse_workspace)
            try:
                tom.model.Expressions["DatabaseQuery"].Expression = shEx
                print(
                    f"{icons.green_dot} The expression in the '{dataset}' semantic model has been updated to point to the '{lakehouse}' lakehouse in the '{lakehouse_workspace}' workspace."
                )
            except Exception as e:
                raise ValueError(
                    f"{icons.red_dot} The expression in the '{dataset}' semantic model was not updated."
                ) from e
 
Try the above code. Similarly, you can repoint the reports.
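
For example, a minimal call from a notebook in the model's workspace might look like this (the dataset and lakehouse names below are placeholders, not values from this thread):

# Hypothetical names - replace with your semantic model and the new lakehouse
update_direct_lake_model_lakehouse_connection(
    dataset="Sales Model",
    lakehouse="New_Lakehouse",
)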
 
Regards,
Srisakthi
 
If this helps, please mark it as Accepted Solution.


9 REPLIES
v-vpabbu
Community Support

Hi @HarderT,

 

As we haven't heard back from you, we wanted to kindly follow up and check whether the provided solution resolved your issue. Let us know if you need any further assistance.
If our response addressed your issue, please mark it as Accepted Solution and click Yes if you found it helpful.

 

Regards,
Vinay Pabbu

Hi @v-vpabbu,

 

The problem now is that when we change the connection, it creates a new report instead of overwriting the old one.


Hi @HarderT,

 

That's the default behaviour: Power BI treats it as a new data source, which results in a new report being created instead of overwriting the existing one.
If you're using Power BI Desktop, you can try updating the connection string from within the file and republishing it, which should overwrite the existing report.
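
Alternatively, Semantic Link Labs also ships a report rebind helper that can point an existing report at the repointed semantic model without republishing. A minimal sketch (availability and signature may vary with your installed sempy-labs version; the report and dataset names are placeholders):

import sempy_labs.report as rep

# Hypothetical names - rebind the existing report to the repointed model
rep.report_rebind(
    report="Sales Report",
    dataset="Sales Model",
)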

 

Regards,

Vinay Pabbu

HarderT
Helper II

@v-vpabbu thanks for your quick response, but I need a quicker solution.

 

Is there any way I can rebind a semantic model from one lakehouse to another?

 

I want to save myself the work of recreating everything, so my plan is to create a new lakehouse, connect the source data there, and then repoint the existing semantic model (on which reports are already based) to it.

Hi @HarderT,

 

Yes, you can rebind the semantic model to a new Lakehouse. Just make sure the new Lakehouse has the same table names and structure. Then open the semantic model in Power BI Desktop, go to Data source settings, and point it to the new Lakehouse’s SQL endpoint. After that, publish it back.

 

Regards,

Vinay Pabbu

v-vpabbu
Community Support

Hi @HarderT,

 

Thank you for reaching out to Microsoft Fabric Community Forum.

 

At this time, there is no option in Fabric to disable or re-enable the SQL endpoint manually, and the issue requires backend support from Microsoft to resolve.

As a temporary workaround, copy your tables to a new Lakehouse using a Spark notebook, which will reinitialize the SQL endpoint properly.
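
A minimal PySpark sketch of that copy, assuming both lakehouses are attached to the notebook; the lakehouse and table names are placeholders:

# Sketch only: copy Delta tables from the broken lakehouse into a fresh one.
# "Old_Lakehouse", "New_Lakehouse" and the table list are placeholders.
tables = ["dim_customer", "fact_sales"]

for tbl in tables:
    df = spark.read.table(f"Old_Lakehouse.{tbl}")  # the spark session is predefined in Fabric notebooks
    df.write.format("delta").mode("overwrite").saveAsTable(f"New_Lakehouse.{tbl}")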

 

Since error 615 is a fatal metadata/page-level error and the SQL endpoint is broken even though data is visible in Lakehouse Explorer, it likely needs backend repair by the Microsoft Fabric team.

Please raise a support ticket with Microsoft, referencing the exact error message and timestamp.
https://learn.microsoft.com/en-us/power-bi/support/create-support-ticket 

 


If this post helps, please consider accepting it as the solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!


Regards,
Vinay Pabbu

Hi @HarderT,

 

As we haven't heard back from you, I hope you have raised a support ticket. At this time, we are closing this thread. If you have any further issues, please start a new thread in the community forum, and we will be here to assist you. Thank you for your understanding and continued support.
Thank you for being part of the Microsoft Fabric Community.

 

Regards,

Vinay Pabbu
