
carlwikstrom98
New Member

Dataflow connect to semantic model

Hi,

I am trying out the Fabric data agent a bit and have a question about connecting to a filtered data source. Right now I am connecting to a semantic model, but I would like to connect to a filtered version of it. My thought was to just use a Dataflow and filter the model, but I cannot find whether this is possible. If not, how could I do this? I guess I could import the semantic model into a report in Power BI Desktop and do the work there, but is there a more efficient way?

1 ACCEPTED SOLUTION

Hi @carlwikstrom98,

 

Thank you for reaching out to us.

We understand you want to connect to a filtered version of your semantic model using a Dataflow, and use the measures and transformations already set up in your Fabric semantic model.

At this time, Dataflows can’t connect directly to semantic models. Dataflows are mainly used to bring in raw data from sources like SQL databases, Lakehouses, or files. Since your semantic model is already getting data from SQL Server and includes key logic and measures in Fabric, we suggest the following approaches:

 

Export to Lakehouse:

You can use a notebook or pipeline to query your semantic model (applying filters as needed) and export the filtered data to a Lakehouse table. Then, you can point your Dataflow to the Lakehouse as the data source. This approach keeps your transformations within Fabric and avoids duplicating logic.
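As a minimal sketch of this approach, here is what it could look like in a Fabric notebook using the semantic-link (sempy) library. The model name "Sales Model", the table "Sales", and the Region filter are hypothetical placeholders; substitute your own names:

```python
import sempy.fabric as fabric

# Read a table from the semantic model into a pandas-compatible DataFrame
df = fabric.read_table(dataset="Sales Model", table="Sales")

# Apply the filter you would otherwise have built into a Dataflow
filtered = df[df["Region"] == "Europe"]

# Land the filtered rows in a Lakehouse Delta table
# (spark is predefined in Fabric notebooks attached to a Lakehouse)
spark.createDataFrame(filtered).write.mode("overwrite") \
    .format("delta").saveAsTable("sales_filtered")
```

Your Dataflow (or the data agent) can then read the sales_filtered table from the Lakehouse.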

 

Query via XMLA Endpoint:

If you want to query the semantic model directly, use a notebook or pipeline to connect through the XMLA endpoint. You can write a DAX or MDX query with filters, then load the filtered results into a supported destination (like Lakehouse or Warehouse), which your Dataflow can then use.
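As a rough sketch, again with hypothetical model, table, and column names, the same sempy library can run a filtered DAX query for you; it reaches the model through its XMLA endpoint under the hood:

```python
import sempy.fabric as fabric

# A DAX query that returns only the rows you want
dax_query = """
EVALUATE
CALCULATETABLE('Sales', 'Sales'[Region] = "Europe")
"""

result = fabric.evaluate_dax(dataset="Sales Model", dax_string=dax_query)

# DAX returns column names like Sales[Region]; strip the brackets so
# the names are clean for a Delta table
result.columns = [c.split("[")[-1].rstrip("]") for c in result.columns]

# Write the result where a Dataflow can pick it up
spark.createDataFrame(result).write.mode("overwrite") \
    .format("delta").saveAsTable("sales_filtered_dax")
```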

 

Importing the semantic model into Power BI Desktop and applying filters is another option, but it’s manual and not ideal for automated or scalable refreshes.


3 REPLIES
burakkaragoz
Community Champion

Hi @carlwikstrom98 ,

You can't connect Dataflows directly to semantic models - they're designed for raw data sources, not processed models.

Your options:

Go to the source: Connect your Dataflow to whatever feeds your semantic model (SQL DB, lakehouse, etc.) and apply filters there. Skip the semantic model entirely. (See the sketch after this list.)

Lakehouse workaround: Export your filtered semantic model data to a lakehouse table first, then point your Dataflow at that table.

Data pipeline route: Use a pipeline to pull from your semantic model (via XMLA) and dump it somewhere your Dataflow can reach.
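For the first option, here is a quick sketch of what "go to the source" could look like from a Fabric notebook - the server, database, table, and credentials below are made up, so swap in your own (and prefer a key vault over inline passwords):

```python
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;databaseName=SalesDB"

src = (spark.read.format("jdbc")
       .option("url", jdbc_url)
       # Push the filter down to SQL Server rather than filtering in Spark
       .option("dbtable", "(SELECT * FROM dbo.Sales WHERE Region = 'Europe') AS s")
       .option("user", "sql_user")          # hypothetical credentials
       .option("password", "sql_password")
       .load())

# Land the filtered data where the Dataflow / data agent can reach it
src.write.mode("overwrite").format("delta").saveAsTable("sales_from_source")
```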

Reality check: If you're just trying to get filtered data to a data agent, going through Dataflows might be overkill. You could probably feed the filtered data directly from a notebook or pipeline.

What's feeding your semantic model originally? That's probably where you want to connect and do your filtering.



Hi,

Thank you for your response.

I understand; I had hoped that would be the easiest way to solve my situation. The semantic model gets its data from a SQL Server, but there are many tables and measures created in Fabric that I would like to include.

