
elissandrorosa
New Member

Origin and destination with lakehouse and semantic model

Hi friends!

Has anyone ever encountered the need to map the source tables of a semantic model?

My point is that some semantic models query Lakehouse tables directly, including with SQL queries.

I would like to map this in some automated way, so that a semantic model is refreshed only after its source tables are updated in the Lakehouse.

For example, a semantic model queries data in the sales, customer, and product tables.

I can list the queries through queryinsights and get the tables from the command column, but I don't know which semantic model issued each query, so I can't make the link.
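For context, here is a minimal sketch of that queryinsights step: pulling recent commands and filtering the ones that touch a given table. The view and column names follow the documented queryinsights.exec_requests_history view on the SQL analytics endpoint, but verify them against your own endpoint; the sample rows are made up for illustration.

```python
# SQL to run against the Lakehouse SQL analytics endpoint (e.g. via pyodbc).
# Column names follow the documented queryinsights.exec_requests_history
# view; verify against your endpoint before relying on them.
QUERYINSIGHTS_SQL = """
SELECT distributed_statement_id, login_name, start_time, command
FROM queryinsights.exec_requests_history
ORDER BY start_time DESC
"""

def commands_touching(rows, table_name):
    """Keep (id, command) pairs whose SQL text mentions table_name."""
    needle = table_name.lower()
    return [(rid, cmd) for rid, cmd in rows if needle in cmd.lower()]

# Rows shaped like a cursor.fetchall() result, reduced to (id, command):
rows = [
    ("q1", "SELECT * FROM sales"),
    ("q2", "SELECT * FROM product"),
]
print(commands_touching(rows, "sales"))  # [('q1', 'SELECT * FROM sales')]
```

As the post says, this gives the tables per query but not which semantic model ran each query; that gap is the crux of the question.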

If anyone knows a way or a system table/view that can support this, I'd appreciate it.

Cheers!

5 REPLIES
tayloramy
Memorable Member

Hi @elissandrorosa

 

I'm not sure I understand the question. 

If the semantic model is using Direct Lake mode, it pulls data directly from the Delta/Parquet files in the lakehouse.

 

What mapping are you needing to do? 

 

There are APIs you can use to refresh DirectQuery and Import mode semantic models, and you can also refresh them from a data pipeline.

Semantic model refresh activity in Data Factory for Microsoft Fabric - Microsoft Fabric | Microsoft ...


If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
 

Hello. It's not Direct Lake. I build the report in Power BI Desktop using the SQL analytics endpoint as the source, with SQL queries. Just as the API offers a way to map dataflows to datasets, I'm looking for the equivalent mapping from Lakehouse table to dataset.

Hi @elissandrorosa ,

At the moment, Fabric doesn't provide a ready-made table or view that maps Lakehouse tables to datasets. But you do have a few options:

You can connect to the model through XMLA or the Fabric REST APIs and read the SQL queries it uses. From there, you can parse which Lakehouse tables are referenced.

You’ve already seen the queries in QueryInsights. You can try matching those queries against the dataset ID or workspace metadata in the logs to identify which dataset ran them.

Many teams create their own mapping by parsing the semantic model definition and storing table → dataset relationships. That way, when a Lakehouse table changes, you know which models to refresh.
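A minimal sketch of that mapping step, assuming you have already extracted each model's SQL text (the model names and queries below are placeholders): a crude FROM/JOIN scan, inverted into a table → models lookup.

```python
import re
from collections import defaultdict

def referenced_tables(sql: str) -> set:
    """Crude FROM/JOIN scan; real TMDL/M definitions need fuller parsing."""
    return {m.group(1) for m in re.finditer(r"\b(?:FROM|JOIN)\s+([\w.]+)", sql, re.I)}

def build_table_to_models(model_queries: dict) -> dict:
    """Invert {model: [queries]} into {table: {models}}."""
    mapping = defaultdict(set)
    for model, queries in model_queries.items():
        for q in queries:
            for t in referenced_tables(q):
                mapping[t].add(model)
    return dict(mapping)

# Hypothetical model definitions:
models = {
    "SalesModel": ["SELECT * FROM sales JOIN product ON sales.pid = product.id"],
    "CustModel": ["SELECT * FROM customer"],
}
mapping = build_table_to_models(models)
print(mapping["sales"])  # {'SalesModel'}
```

With a mapping like this stored somewhere durable, a pipeline can look up which models depend on a table that just changed.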

You can trigger semantic model refresh only when specific tables are updated using the Semantic Model Refresh activity in Data Factory pipelines.
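Outside of Data Factory, the refresh itself can also be fired programmatically via the Power BI REST API's dataset refreshes endpoint. A hedged sketch (the workspace/dataset IDs are placeholders, and acquiring the access token, e.g. via MSAL, is out of scope here):

```python
import urllib.request

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Power BI REST API endpoint for triggering a dataset refresh."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

def trigger_refresh(group_id: str, dataset_id: str, access_token: str) -> int:
    """POST a refresh request; an empty body triggers a standard refresh
    (the enhanced refresh API accepts options in the body)."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=b"",
        headers={"Authorization": f"Bearer {access_token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # expect 202 Accepted
        return resp.status

# Placeholder IDs, just to show the URL shape:
print(refresh_url("ws-123", "ds-456"))
```

Combined with the table → model mapping above this thread discusses, a pipeline or notebook can call this only for the models affected by a changed Lakehouse table.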

Thank you.

Hi @elissandrorosa ,
Thanks for reaching out to the Microsoft fabric community forum. 

 

I would also like to take a moment to thank @tayloramy for actively participating and for the solutions you’ve been sharing in the community forum. Your contributions make a real difference.

 

For Direct Lake models the queries are not stored, as they read directly from Delta/Parquet files in the Lakehouse. That’s why you won’t find a mapping table/view for this.

The best approach is to use the Fabric APIs or Data Pipeline to control and automate semantic model refresh.

If I've misunderstood your needs or you still have problems, please feel free to let us know.

Best Regards, 
Community Support Team.

Hi @elissandrorosa ,

I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We’re always here to support you.

Best Regards, 
Community Support Team.
