Hello,
We are currently working with Microsoft Fabric using Git integration through Azure DevOps. I have a couple of questions:
When I branch out to a different workspace, create a Lakehouse and a Semantic Model, and then merge back to the main branch, the Semantic Model in the main branch still refers to the Lakehouse in the other workspace. Is this expected behavior? If yes, how do we ensure that the connection to the Lakehouse shifts to the main workspace during the CI process?
Previously, when creating Semantic Models, I noticed they used SQL Connectors, but now they show AzureDataLakeConnector. Could anyone point me to documentation explaining this change and how it might impact existing workflows or integrations?
Thank you!
Hi @AdarshChekodu,
Thank you for reaching out to the Microsoft Fabric Community Forum. Also, thanks to @Srisakthi for the inputs on this thread.
Has your issue been resolved? If the response provided by the community member @Srisakthi addressed your query, could you please confirm? It helps us ensure that the solutions provided are effective and beneficial for everyone.
Hope this helps clarify things. Let me know what you find after giving these steps a try; happy to help you investigate this further.
Thank you for using the Microsoft Community Forum.
Hey, thanks! @Srisakthi's answer kind of works for point 1. Any feedback on point 2 of my query?
Thanks a lot.
Hi @AdarshChekodu ,
I'm glad that helped! Sorry, I have not tried anything on point 2, so I cannot comment on it here. One thing I have come across is that you need to use the SQL endpoint connection to the Lakehouse if you have queries in your semantic model.
Regards,
Srisakthi
Hi @AdarshChekodu,
Thanks for getting back, and glad to hear the suggestion shared earlier by @Srisakthi helped with your first point.
Regarding your second question about the connector change: In Fabric, semantic models created from a Lakehouse are gradually moving to use the Azure Data Lake based connection (including Direct Lake scenarios), instead of the older SQL connector approach. This is expected behaviour as Fabric is shifting towards a more lake-native architecture for semantic models.
This shouldn't negatively impact your existing models, but depending on how your current CI/CD process is set up, you may need to review and update any scripts or deployment steps that assume the older SQL connector (a small sketch for auditing this follows the links below).
Refer to these links:
1. https://learn.microsoft.com/en-us/fabric/data-warehouse/semantic-models
2. https://learn.microsoft.com/en-us/fabric/fundamentals/direct-lake-overview
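As a hedged illustration of the kind of review mentioned above (not from the original thread): the Power BI REST API exposes a datasources endpoint for semantic models, which can help you see whether a model still connects through a SQL endpoint or through the newer lake-based connection. The workspace/dataset IDs and the token source below are hypothetical placeholders.

```python
# Minimal sketch, assuming you already have an Azure AD access token with
# Power BI API permissions: list the datasource types of a semantic model
# so you can audit which connector it currently uses.
import os
import requests

TOKEN = os.environ["POWERBI_ACCESS_TOKEN"]   # assumption: token provided via env var
WORKSPACE_ID = "<main-workspace-guid>"       # hypothetical placeholder
DATASET_ID = "<semantic-model-guid>"         # hypothetical placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/datasources"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Each entry reports a datasourceType and connectionDetails, which is usually
# enough to tell a SQL endpoint connection apart from a lake-based one.
for ds in resp.json().get("value", []):
    print(ds.get("datasourceType"), ds.get("connectionDetails"))
```

This is only an audit aid; it doesn't change the model, it just reports what each semantic model is currently connected to.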
If you have any concerns around how this affects your current deployment workflows, feel free to share more details and we can help investigate on that.
Thank you.
@v-kpoloju-msft
Thanks alot for the response.
Is there a plan from Microsoft to help us migrate from the old connector "SQL Endpoint" to the new one? Or will we have to re-create the entire semantic model with the new connector?
Also, in the future, will there be features that support only the new connector? Or will the old connector be "Out of Support"?
Thanks,
Adarsh
Hi @AdarshChekodu,
Thanks for the additional questions.
Currently, there isn't an automated migration path available to switch an existing semantic model from the SQL Endpoint connector to the new Lake-based connector. Moving to the new connector typically requires creating a new semantic model based on the Lakehouse. We understand this can require planning from a deployment and governance perspective.
Regarding support going forward, both connectors continue to be supported today. The new Lake-based connector is part of the long-term direction in Fabric, especially for Direct Lake and Lake-centric scenarios, but there has not been an announcement about deprecating the SQL Endpoint connector. If there are any changes to support policies in the future, those will be communicated through the usual Microsoft channels and documentation.
Thank you again for using the Microsoft Fabric Community Forum.
Hi @AdarshChekodu,
Just checking in to see if the issue has been resolved on your end. If the earlier suggestions helped, that's great to hear! And if you're still facing challenges, feel free to share more details; happy to assist further.
Thank you.
Hi @AdarshChekodu ,
For point 1: parameterize your semantic model, and post-deployment, execute an update of the parameters using the REST API from ADO to point to the Lakehouse of your actual branch (see the sketch after the link below).
https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/update-parameters
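To make this concrete, here is a minimal sketch (not from the thread itself) of the post-deployment call described above, using the documented Update Parameters endpoint. The parameter names (e.g. LakehouseWorkspaceId, LakehouseId), the IDs, and the token handling are assumptions; they should match whatever parameters you actually defined in your semantic model and however your ADO pipeline supplies credentials.

```python
# Minimal sketch of the post-deployment step: update the semantic model's
# parameters so its Lakehouse connection points at the target (main) workspace.
# Parameter names, IDs, and the token source are assumptions, not from the thread.
import os
import requests

TOKEN = os.environ["POWERBI_ACCESS_TOKEN"]      # e.g. injected by the ADO pipeline
WORKSPACE_ID = "<main-workspace-guid>"          # hypothetical placeholder
DATASET_ID = "<semantic-model-guid>"            # hypothetical placeholder

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/Default.UpdateParameters"
)
body = {
    "updateDetails": [
        # These parameter names must already exist in the model; examples only.
        {"name": "LakehouseWorkspaceId", "newValue": "<main-workspace-guid>"},
        {"name": "LakehouseId", "newValue": "<main-lakehouse-guid>"},
    ]
}
resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print("Parameters updated:", resp.status_code)
```

Note that, per the Power BI REST API documentation, the parameters must already exist in the model, and a dataset refresh is typically needed afterwards for the new values to take effect.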
Regards,
Srisakthi
@Srisakthi, thanks a lot for the reply. I was wondering whether there would be anything that comes by default with Fabric. Using ADO is the best option as of now, I guess!