Hi,
When using a Fabric Lakehouse as a source in ADF dataflows, there is no way to pass a filter clause back to the source to restrict the data flowing into the dataflow. The same operation against any regular database works fine.
We tried using Lakehouse views as a workaround, but they do not appear as a source in a dataflow either. Lakehouse schemas are enabled, which could be the reason, but we'd like to avoid having to recreate the lakehouse without schema support.
This severely limits our ability to restrict the number of rows flowing into the ADF data flows. We have some very large tables, so this is a real problem for us. Are we missing a trick? Any suggestions, or is this a known issue?
Is there a SQL connector in ADF dataflows?
Could you copy the Lakehouse's SQL connection string and paste it into a SQL connector in the ADF dataflow? That sounds like something that would allow you to pass filters in the query.
I'm hoping you won't run into issues with SQL analytics endpoint update delays.
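For example, here is a minimal sketch of that kind of pushdown query against the SQL analytics endpoint, assuming Python with pyodbc; the server, database, table, and column names are hypothetical placeholders taken from wherever your Lakehouse's SQL connection string points:

```python
import pyodbc

# Connection string built from the Lakehouse's SQL analytics endpoint.
# Server and database names below are placeholders -- substitute your own.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=your-workspace.datawarehouse.fabric.microsoft.com;"
    "Database=YourLakehouse;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

# The WHERE clause is evaluated on the endpoint, so only the filtered
# rows leave the source -- the pushdown that the Lakehouse connector
# in the dataflow doesn't offer.
query = """
    SELECT *
    FROM dbo.SalesFact              -- hypothetical large table
    WHERE OrderDate >= '2024-01-01' -- hypothetical filter column
"""

cursor = conn.cursor()
for row in cursor.execute(query):
    ...  # process or stage the filtered rows
```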
Hi @SidTheSloth ,
You can preprocess the data first with an activity in the data pipeline, and then use the result as the data source for the Dataflow Gen2, chaining the two activities together.
Fabric decision guide - copy activity, dataflow, or Spark - Microsoft Fabric | Microsoft Learn
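As a rough PySpark sketch of that pattern, assuming a notebook activity that runs before the dataflow and hypothetical table and column names (SalesFact, OrderDate, SalesFact_Staged):

```python
from pyspark.sql import SparkSession, functions as F

# In a Fabric notebook `spark` is already provided; getOrCreate() simply
# reuses that session.
spark = SparkSession.builder.getOrCreate()

# Filter the large Lakehouse table once, up front.
source = spark.read.table("SalesFact")                    # hypothetical large table
filtered = source.filter(F.col("OrderDate") >= "2024-01-01")

# Land the result in a smaller staging table that the Dataflow Gen2
# then reads instead of the full table.
(filtered.write
    .mode("overwrite")
    .saveAsTable("SalesFact_Staged"))
```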
Best Regards,
Adamk Kong
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Thank you. While pre-processing the data to create a new table can be a viable workaround, it adds to the processing load and overall run time. With Parquet tables holding hundreds of millions of rows, that is a lot of time.
I also want to clarify that this is about ADF pipelines, not Fabric Data Factory pipelines. Ideally, as part of the newly added functionality, we would be able to filter at the source (like we can with other SQL-based sources); that would be the right way to do things. The inability to query views from dataflows (for Lakehouses with schemas) compounds the problem.
Appreciate your time. Thank you.