Hello community,
We are trying to use a Microsoft Fabric Lakehouse as an external data source within a Fabric SQL database. The goal is to create external tables that reference Lakehouse data so we can use them in stored procedures, views, and other SQL objects.
We have attempted several approaches but haven't been able to get this working as expected. Has anyone successfully set up this kind of integration, and if so, could you share the steps or best practices? Any guidance or examples would be greatly appreciated.
Thank you
@Kristian_nho During Ignite we announced the Public Preview of Data Virtualization for Fabric SQL Database; the end goal is to offer a solution for the very problem you described. You can check the capabilities, demos, and requirements here:
Data Virtualization in SQL Database - Microsoft Fabric | Microsoft Learn
Basically, you can use OPENROWSET, OPENROWSET + views, or external tables themselves, depending on your requirements.
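As a rough illustration of the OPENROWSET route, here is a minimal sketch. The workspace, Lakehouse, and table names are placeholders, and whether FORMAT = 'DELTA' is available depends on what the preview supports in your region (check the linked docs):

-- Query a Lakehouse Delta table in place from a Fabric SQL database (preview).
-- The abfss:// path and all object names below are placeholders.
SELECT TOP (10) *
FROM OPENROWSET(
    BULK 'abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/MyLakehouse.Lakehouse/Tables/dbo/MyTable',
    FORMAT = 'DELTA'
) AS lakehouse_rows;

Wrapping a query like this in a view then lets you reference the Lakehouse data from stored procedures and other SQL objects.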
Hi Kristian,
I haven't tested this out, but you might find this link interesting:
https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view...
In theory you can create a link to the underlying data storage and use OPENROWSET to load the data into a table in SQL Server. Unfortunately, you can't create an external table on top of Blob Storage; it is just for doing bulk actions.
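For illustration, that bulk pattern looks roughly like this. This is a hypothetical sketch against generic Azure storage: the names are placeholders, and the exact CREATE EXTERNAL DATA SOURCE options (TYPE, CREDENTIAL, and so on) vary by SQL Server version, so check the linked docs:

-- Define an external data source over a storage account (a credential is
-- usually required; omitted here for brevity).
CREATE EXTERNAL DATA SOURCE MyStorage
WITH (LOCATION = 'https://myaccount.blob.core.windows.net/mycontainer');

-- Bulk-load a Parquet file into a local staging table.
SELECT *
INTO dbo.StagingSales
FROM OPENROWSET(
    BULK 'sales/2024/data.parquet',
    DATA_SOURCE = 'MyStorage',
    FORMAT = 'PARQUET'
) AS src;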
Btw, it would be nice if we could do what you've asked for; I suggest submitting it as an idea to the product team.
Best
Onur
😊 If this post helped you, feel free to give it some Kudos! 👍
✅ And if it answered your question, please mark it as the accepted solution.
Hi @Kristian_nho ,
You can work with Lakehouse data from a Fabric SQL database today, but not through a fully generally available external-table feature yet. The supported options are below; rough sketches of each follow the list:
1. Query via the Lakehouse SQL analytics endpoint
Any Lakehouse data is automatically exposed through its SQL analytics endpoint. You can build views, stored procedures, and other SQL objects directly against that endpoint.
2. Use Data Virtualization (preview)
If enabled in your tenant, you can define external data sources pointing to OneLake, set up file formats (Delta, Parquet, CSV, JSON), and create external tables. Some regions also support OPENROWSET / external data source patterns to query Delta/Parquet files in OneLake without ingestion. A few things to keep in mind:
- External tables are read-only and tuned for analytical workloads.
- Use Managed Identity for secure authentication.
- Align schemas carefully to avoid mismatches.
- Inserts/updates aren't supported yet.
3. Ingest into SQL DB/Warehouse
If virtualization isn't available, the fallback is to load Lakehouse tables into the SQL database (e.g., with CTAS or INSERT INTO ... SELECT) and then build your SQL objects on top.
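To make the options concrete, here are minimal hypothetical sketches. Every object name, path, and column below is a placeholder, and the preview syntax should be verified against the Data Virtualization docs linked earlier in the thread.

Option 1 - a view defined in the Lakehouse's SQL analytics endpoint (assuming a Lakehouse table dbo.Sales):

-- Runs in the SQL analytics endpoint itself; downstream SQL objects can
-- then reference the view.
CREATE VIEW dbo.vw_SalesByRegion
AS
SELECT Region, SUM(Amount) AS TotalAmount
FROM dbo.Sales
GROUP BY Region;

Option 2 - an external table in the Fabric SQL database over OneLake (assuming the preview is enabled in your tenant):

-- External data source pointing at the Lakehouse's Tables folder in OneLake.
CREATE EXTERNAL DATA SOURCE OneLakeSource
WITH (LOCATION = 'abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/MyLakehouse.Lakehouse/Tables');

-- File format for Delta (the preview also lists Parquet, CSV, and JSON).
CREATE EXTERNAL FILE FORMAT DeltaFormat
WITH (FORMAT_TYPE = DELTA);

-- Read-only external table; the column list must match the Lakehouse schema.
CREATE EXTERNAL TABLE dbo.ext_Sales
(
    Region VARCHAR(50),
    Amount DECIMAL(18, 2)
)
WITH (
    LOCATION = '/dbo/Sales',
    DATA_SOURCE = OneLakeSource,
    FILE_FORMAT = DeltaFormat
);

Option 3 - ingesting a copy instead, by selecting from one of the patterns above into a local table:

SELECT Region, Amount
INTO dbo.Sales_Local
FROM dbo.ext_Sales;  -- or FROM OPENROWSET(...) if external tables aren't enabled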
There’s no general‑availability “external table to Lakehouse” feature yet. Depending on what’s enabled in your workspace, either use the SQL analytics endpoint, data virtualization (preview), or ingestion into SQL DB/Warehouse.
Hope this helps.
Thank you.