julsr
Resolver III

Direct query vs Import Mode when complex DAX is used

Hello Everyone,

 

I’m currently using Dataflows to build my visualizations and reports. Each report automatically creates a semantic model that includes the same data from the Dataflow, plus any additional columns or measures I’ve added (using DAX).

My goal is to update the data only once so that when the Dataflow is refreshed, the semantic model automatically reflects the latest data. Right now, I have to refresh the Dataflow first, and then—once it’s complete—trigger a separate refresh for the semantic model (via a scheduled refresh in Power BI).

I’d like to streamline this into a single update. I’ve explored using DirectQuery, but it doesn’t support many of the DAX functions I rely on in Import mode.

Is there a better approach to achieve a single, synchronized refresh while preserving my DAX capabilities?

 

Thank you!

1 ACCEPTED SOLUTION
GeraldGEmerick
Solution Sage

@julsr One option would be to simply take your dataflow Power Query code and use that in your import queries. Then you are just refreshing a single time. The other option would be to land your dataflows in a Fabric warehouse/lakehouse and use Direct Lake. DirectQuery is going to be limited and slow. Direct Lake should be comparable to import but somewhat slower.


8 REPLIES
julsr
Resolver III

Thank you all! Everyone gave me insights on how to solve this problem!

V-yubandi-msft
Community Support

Hi @julsr ,

Just checking in: is your issue resolved, or are you still facing difficulties? If you need any additional information or clarification, please feel free to reach out.


Thank you.

MohdZaid_
Frequent Visitor

Hey @julsr , 

 

To streamline refreshes while preserving DAX in Import mode, use Composite Models with your Dataflow as a source in Import mode. Then, enable “Scheduled Refresh” for the dataset that uses the Dataflow. Power BI automatically queues the Dataflow refresh before the dataset refresh, ensuring the semantic model always reflects the latest data without manual intervention.

 

 

Key points:

  • Keep the dataset in Import mode to retain full DAX functionality.

  • Schedule dataset refresh, not the Dataflow separately.

  • Power BI handles dependencies, refreshing the Dataflow first, then the semantic model.

This approach provides a single, synchronized refresh while keeping your DAX measures intact.

 

If it solved your issue, feel free to mark it as the solution so others can benefit too.

 

Thanks for being part of the community.

"Power BI automatically queues the Dataflow refresh before the dataset refresh"

That is incorrect. You need to orchestrate that yourself, via Power Automate or Fabric Data Pipelines.
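For anyone who wants to script that orchestration themselves rather than use Power Automate or Fabric Data Pipelines, here is a minimal sketch against the Power BI REST API: trigger the dataflow refresh, poll its transaction log until the run finishes, and only then refresh the dependent semantic model. The workspace/dataflow/dataset IDs and the access token are placeholders you would supply yourself (the token needs the appropriate Dataflow and Dataset scopes); this is an illustrative sketch, not a hardened implementation.

```python
"""Sketch: sequential dataflow -> semantic model refresh via the
Power BI REST API. All IDs and the bearer token are assumptions."""
import json
import time
import urllib.request

BASE = "https://api.powerbi.com/v1.0/myorg"


def dataflow_refresh_url(group_id, dataflow_id):
    """POST here to start a dataflow refresh."""
    return f"{BASE}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"


def dataset_refresh_url(group_id, dataset_id):
    """POST here to start a dataset (semantic model) refresh."""
    return f"{BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"


def _call(url, token, method="GET", body=None):
    """Tiny stdlib HTTP helper; returns parsed JSON (or None if empty)."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(
        url, data=data, method=method,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        raw = resp.read()
        return json.loads(raw) if raw else None


def refresh_in_order(token, group_id, dataflow_id, dataset_id, poll_secs=30):
    # 1. Kick off the dataflow refresh.
    _call(dataflow_refresh_url(group_id, dataflow_id), token,
          method="POST", body={"notifyOption": "NoNotification"})

    # 2. Poll the dataflow's transaction log until the latest run finishes.
    tx_url = f"{BASE}/groups/{group_id}/dataflows/{dataflow_id}/transactions"
    while True:
        latest = _call(tx_url, token)["value"][0]
        if latest["status"] != "InProgress":
            break
        time.sleep(poll_secs)

    # 3. Only on success, trigger the semantic-model refresh.
    if latest["status"] == "Success":
        _call(dataset_refresh_url(group_id, dataset_id), token, method="POST")
```

The same sequencing logic translates directly into a Power Automate flow (dataflow refresh action, "when a dataflow refresh completes" trigger, then a dataset refresh action) or a Fabric pipeline with two chained activities.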

To clarify the earlier point about Power BI automatically queuing the Dataflow refresh before the dataset refresh: in some scenarios, Power Query-based solutions give you more control over refresh orchestration, especially when you use Power Query directly in the report instead of Dataflows.

 

Since Dataflows are indeed based on Power Query, you can leverage the same techniques and scheduling capabilities within Power BI to synchronize the refreshes effectively. In cases where you need more advanced orchestration (such as dependencies or triggering events), Power Automate or Fabric Data Pipelines would indeed be the next step for more complex workflows.

V-yubandi-msft
Community Support

Hi @julsr ,

Could you provide an update on your issue? Please review the helpful information shared by community members @GeraldGEmerick and @lbendlin, and let us know if you need more details or further assistance.

 

Thank you for your response, @GeraldGEmerick and @lbendlin .

lbendlin
Super User

Use Occam's Razor as a guide. If you cannot articulate what benefit the dataflow brings in your scenario, eliminate the dataflow and use a "golden" semantic model instead.

