Hi,
I am trying out the Fabric data agent a bit and have a question about connecting to a filtered data source. Right now I am connecting to a semantic model, but I would like to connect to a filtered version of it. My thought was to use a dataflow to filter the model, but I cannot find whether this is possible. If not, how could I do this? I guess I could import the semantic model into a report in Power BI Desktop and do the work there, but is there a more efficient way?
Hi @carlwikstrom98,
Thank you for reaching out to us.
We understand you want to connect to a filtered version of your semantic model using a Dataflow, and use the measures and transformations already set up in your Fabric semantic model.
At this time, Dataflows can’t connect directly to semantic models. Dataflows are mainly used to bring in raw data from sources like SQL databases, Lakehouses, or files. Since your semantic model is already getting data from SQL Server and includes key logic and measures in Fabric, we suggest the following approaches:
Export to Lakehouse:
You can use a notebook or pipeline to query your semantic model (applying filters as needed) and export the filtered data to a Lakehouse table. Then point your Dataflow at the Lakehouse as the data source. This approach keeps your transformations within Fabric and avoids duplicating logic (see the sketch after the next option).
Query via XMLA Endpoint:
If you want to query the semantic model directly, use a notebook or pipeline to connect through the XMLA endpoint. You can write a DAX or MDX query with filters, then load the filtered results into a supported destination (like Lakehouse or Warehouse), which your Dataflow can then use.
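Here is a minimal notebook sketch combining the two approaches above, using the semantic-link (sempy) library available in Fabric notebooks. The model, table, and column names are hypothetical placeholders; adjust them to your own model:

```python
import sempy.fabric as fabric

# Query the semantic model (over its XMLA endpoint) with a filtered DAX query.
# "SalesModel", 'Sales', and the columns below are placeholder names.
dax_query = """
EVALUATE
SELECTCOLUMNS(
    FILTER('Sales', 'Sales'[Region] = "Europe"),
    "OrderId", 'Sales'[OrderId],
    "Amount",  'Sales'[Amount]
)
"""
filtered = fabric.evaluate_dax(dataset="SalesModel", dax_string=dax_query)

# DAX result headers come back bracketed (e.g. "[Amount]"); clean them up
# so they are valid Delta column names.
filtered.columns = [c.split("[")[-1].rstrip("]") for c in filtered.columns]

# Write to a Lakehouse table that the Dataflow (or the data agent) can read.
# Assumes the notebook is attached to the target Lakehouse.
spark.createDataFrame(filtered).write.mode("overwrite").saveAsTable("sales_filtered")
```

Once the table exists, you can schedule the notebook in a pipeline so the filtered copy stays in sync with the model's refreshes.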
Importing the semantic model into Power BI Desktop and applying filters is another option, but it’s manual and not ideal for automated or scalable refreshes.
Hi @carlwikstrom98 ,
You can't connect Dataflows directly to semantic models - they're designed for raw data sources, not processed models.
Your options:
Go to the source: Connect your Dataflow to whatever feeds your semantic model (SQL DB, lakehouse, etc.) and apply filters there. Skip the semantic model entirely.
Lakehouse workaround: Export your filtered semantic model data to a lakehouse table first, then point your Dataflow at that table.
Data pipeline route: Use a pipeline to pull from your semantic model (via XMLA) and dump it somewhere your Dataflow can reach.
Reality check: If you're just trying to get filtered data to a data agent, going through Dataflows might be overkill. You could probably feed the filtered data directly from a notebook or pipeline (a short sketch follows below).
What's feeding your semantic model originally? That's probably where you want to connect and do your filtering.
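As a sketch of that notebook route, semantic-link's `evaluate_measure` can pull an existing model measure with filters applied, so you keep the logic already built into the model. The dataset, measure, and column names here are hypothetical:

```python
import sempy.fabric as fabric

# Evaluate an existing model measure, grouped and filtered, into a pandas
# DataFrame. All names are placeholders; swap in your own model objects.
df = fabric.evaluate_measure(
    dataset="SalesModel",                        # your semantic model
    measure="Total Sales",                       # a measure defined in the model
    groupby_columns=["Date[Year]", "Customer[Region]"],
    filters={"Customer[Region]": ["Europe"]},    # the filter you wanted to apply
)

# From here you can hand df to the data agent's source, e.g. by writing it
# to a Lakehouse table as in the earlier sketch.
```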
Hi,
Thank you for your response.
I understand. I hoped it could be the easiest way to solve my situation. The semantic model gets its data from a SQL Server, but there are many tables and measures created in Fabric that I would like to include.