DanielleJvR
New Member

How to query data directly from on-premises SQL Server using Microsoft Fabric

Hi there,

 

I have a use case where certain client data is too sensitive, and the client wants to avoid storing it anywhere other than their on-premises SQL Server. They are looking at Microsoft Fabric as a solution for their data needs, and I am trying to see if we can import some of the data into a lakehouse (for future ML models) and then query the more sensitive data directly from the source. But I am struggling to find a solution in Fabric that accommodates this requirement.

 

Using shortcuts to create a Direct Lake semantic model doesn't seem to be a viable solution, as SQL Server isn't an available source option to choose.

 

I am very new to data engineering and to Microsoft Fabric, so any guidance would be greatly appreciated :).

 

Thanks in advance!

1 ACCEPTED SOLUTION
SudhavaniKolla3
Helper II

Hi,

As per my understanding, here is my answer; I also replicated the scenario in Power BI Desktop locally.

You are pushing the final table to the lakehouse after applying transformations in a dataflow, creating a semantic model on top of that final table, and building Power BI reports from it.

Instead of creating the semantic model on top of the final tables in the lakehouse, open Power BI Desktop, go to the OneLake data hub, select the lakehouse as a source and pick the required tables, and connect to SQL Server in the same way (select the required tables or write a query). The semantic model will then be created in DirectQuery mode instead of Live mode; this behaves almost the same as Live, but also allows querying the SQL Server data.
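For the "write a query" option in the SQL Server connector, the idea is to hand the connector a statement that exposes only the columns the report needs, so the most sensitive fields never leave the source. A minimal sketch, assuming a hypothetical dbo.Clients table (all table and column names here are illustrative, not from the original post):

```sql
-- Hypothetical native query for the SQL Server connector (DirectQuery).
-- Only the columns needed for reporting are selected; more sensitive
-- fields on dbo.Clients are simply never pulled into the model.
SELECT
    c.ClientID,
    c.Region,
    c.AccountStatus,
    c.LastActivityDate
FROM dbo.Clients AS c
WHERE c.IsActive = 1;
```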

Please refer to the images below.

 

[Eight screenshots illustrating the steps above in Power BI Desktop.]


7 REPLIES

Anonymous
Not applicable

Hi @DanielleJvR ,

Here are some of my personal thoughts on your question:

 

1. Dynamic Data Masking (DDM).

This feature allows you to mask sensitive data in your Fabric Warehouse and Lakehouse SQL Endpoints. DDM ensures that sensitive information is not exposed to unauthorized users, which can help you comply with data privacy regulations without altering your core data structure.      

You can look at this document below: 

Announcing: Dynamic Data Masking for Fabric Warehouse and Lakehouse SQL Endpoint | Microsoft Fabric ...
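For reference, the masks themselves are applied with ordinary T-SQL DDL against the warehouse. A minimal sketch, assuming a hypothetical dbo.Customers table and a [ReportAuditors] role (both illustrative, not from the linked announcement):

```sql
-- Hypothetical example: mask sensitive columns for non-privileged users.
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Partial mask: keep the first 2 characters, pad the rest.
ALTER TABLE dbo.Customers
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(2, "XXXXXXXX", 0)');

-- Users without the UNMASK permission see masked values;
-- grant UNMASK only to trusted roles.
GRANT UNMASK TO [ReportAuditors];
```

Users who query dbo.Customers without UNMASK then see values like aXXX@XXXX.com, while the stored data is unchanged.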

 

2. Data Loss Prevention (DLP) Policies.

Microsoft Purview's DLP policies can automatically detect sensitive information as it is uploaded into lakehouses and take remediation actions. This helps in adhering to regulations like GDPR and HIPAA.

Here is the document you can read:

Announcement: Microsoft Purview Data Loss Prevention policies have been extended to Fabric lakehouse...

 

 

 

Best Regards

Yilong Zhou

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

FabianSchut
Super User

Instead of leaving the sensitive data in the on-premises sources, you could also consider restricting access to Fabric and the data within. You could take a look at Private Link: https://blog.fabric.microsoft.com/en-us/blog/announcing-azure-private-link-support-for-microsoft-fab...

lbendlin
Super User

So they want to keep all their data in their on-prem SQL Server instance, and they also want to have the data in Fabric?

 

Would they instead consider using their on-prem SQL Server data source as a DirectQuery source in their Power BI reports?

They would want to keep the sensitive data on-premises and move the rest of the data into Fabric. We have suggested using DirectQuery for the sensitive data, but I can't seem to get this right in a demo I'm doing.

 

I've created an on-premises gateway connection to the SQL Server, moved some data into a Fabric lakehouse, created a dataflow for some transformations, and created a semantic model to report from. From there, there is no option to DirectQuery another source. So I downloaded the Power BI file, as there is more functionality in Power BI Desktop, and tried adding a DirectQuery source there, but then I get an error saying that I need to change my model to enable DirectQuery.

 

From the start of the project right up to where I am now, I didn't see anywhere that I had the option to do this.

I hope this makes sense?

"and created a semantic model to report from"

Instead of using that, I would create the semantic model in Power BI Desktop: connect to the on-prem source in DirectQuery mode, then connect to the lakehouse's SQL endpoint in the same way, and add local tables as needed.
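If it helps to verify the lakehouse side independently first, the lakehouse's SQL analytics endpoint accepts plain read-only T-SQL (for example from SSMS), so you can sanity-check the non-sensitive tables before building the composite model. A minimal sketch with a hypothetical dbo.Sales table (the name is illustrative):

```sql
-- Hypothetical read-only query against the lakehouse SQL analytics endpoint.
SELECT TOP (10)
    s.Region,
    SUM(s.Amount) AS TotalSales
FROM dbo.Sales AS s
GROUP BY s.Region
ORDER BY TotalSales DESC;
```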
