Hi,
I am having difficulty updating a report that uses the OneLake Data Hub and DirectQuery.
I created a report R2 that connects to a Power BI semantic model S1 through the OneLake Data Hub connector, then added DirectQuery data S2 from another source, which forced Power BI to create a "local model".
I thought that if I then made some semantic model updates to S1 (and published it), they would show up in the R2 report online immediately. But it seems they don't.
When I, for example, change the data type of a column in S1 and want that change to be visible in R2, I have to open R2 in Desktop, refresh it, and then publish it again.
This seems stupid. I would have expected a change to S1, once published, to take effect on R2 online immediately (the same as if I had a regular thin report using only S1). But it doesn't.
Any advice? I don't want to spend time opening, refreshing, and republishing every "thin report" that combines multiple sources each time I change something in the main report/semantic model. Or is this just the native behavior?
This is indeed a limitation of composite models that combine DirectQuery with a live connection (such as a OneLake Data Hub semantic model). Here's what's happening and some options to consider:
1. Local Model Constraints: When you add DirectQuery sources to a report connected to a semantic model, Power BI creates a local model. Unfortunately, the local model does not automatically sync with updates made to the original semantic model, so you have to reopen, refresh, and republish the report to reflect any changes made to S1.
2. Alternative Approaches:
- Use DirectQuery Only: If feasible, source all data via DirectQuery so that no local model is needed. The report then responds to live changes in the connected data sources.
- Automate Desktop Refresh and Republish: Power Automate Desktop or other scheduling tools can help automate opening, refreshing, and publishing the report; the publish step can also be scripted against the REST API (see the sketch after this list). This doesn't replace live syncing, but it minimizes manual intervention.
- Dataflows as an Intermediate Layer: If feasible, move the data sources shared by S1 and S2 into Dataflows and have R2 pull from those. This allows faster refreshes and somewhat centralizes updates.
Unfortunately, the native behavior currently requires manual intervention for reports that mix live and DirectQuery sources. Hopefully, future updates will improve integration across composite models with these different data sources.
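If you do script the publish step, here is a minimal sketch using the Power BI REST Imports API. It only automates the republish; the Desktop refresh of the local model stays manual. The workspace id, file name, and token are placeholders (you could acquire a token with a library such as msal), not values from this thread:

```python
# Minimal sketch: republish a locally refreshed PBIX via the Power BI REST
# Imports API, so only the Desktop refresh step stays manual.
# Assumptions: WORKSPACE_ID, PBIX_PATH, and the access token are placeholders
# you supply; acquiring the token (e.g. with msal) is out of scope here.
import requests

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # your workspace (group) id
PBIX_PATH = "R2.pbix"                                  # the locally refreshed report
ACCESS_TOKEN = "<AAD access token with Dataset.ReadWrite.All scope>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports"
    "?datasetDisplayName=R2&nameConflict=CreateOrOverwrite"
)
with open(PBIX_PATH, "rb") as f:
    resp = requests.post(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        files={"file": f},  # multipart upload of the PBIX
    )
resp.raise_for_status()
print("Import accepted:", resp.json())  # returns an import id you can poll
```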
"I would have expected a change to S1, once published, to take effect on R2 online immediately"
That is only true for the data, and for varying interpretations of "immediately".
Metadata changes need to be propagated.
So the only way to propagate a metadata change is to refresh in Desktop? Is there no other way to do it online, from the Service?
Sort of. You also have ALM Toolkit and Tabular Editor, which can push metadata changes.
Would you say that is more comfortable than using the Desktop report itself? Anyway, thanks for the reply.
Not more comfortable, but less invasive. A prime example is a semantic model that includes an incremental refresh data source. When you make changes to that model in Desktop and then publish, this resets (i.e. destroys) all your incremental refresh partitions and attempts to refill them. Not only is that wasteful, it may also time out.
So instead you can use ALM Toolkit to propagate only the metadata changes without killing the partition definitions.
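For example (a minimal sketch, not a recipe): assuming Tabular Editor 2 is installed and your workspace exposes an XMLA endpoint (Premium/Fabric), a metadata-only deployment can be scripted by leaving out the partitions switch. The paths, workspace, and dataset names are placeholders, and the switches should be verified against the Tabular Editor documentation for your version:

```python
# Minimal sketch: push metadata changes to the published model with the
# Tabular Editor 2 command line instead of republishing from Desktop.
# Assumptions: the install path, model file, workspace, and dataset name
# are placeholders; verify the CLI switches against your version's docs.
import subprocess

TE_EXE = r"C:\Program Files (x86)\Tabular Editor\TabularEditor.exe"  # assumed install path
MODEL_BIM = r"C:\models\S1\Model.bim"  # metadata saved from your local copy of S1
XMLA_ENDPOINT = "powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace"   # XMLA endpoint
DATASET = "S1"

# -D deploys to the target database, -O allows overwriting it; leaving out
# the partitions switch (-P) keeps the incremental refresh partitions intact.
subprocess.run(
    [TE_EXE, MODEL_BIM, "-D", XMLA_ENDPOINT, DATASET, "-O"],
    check=True,
)
```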
Thank you. Since you mention incremental refresh: is it possible to download, via ALM Toolkit or Tabular Editor, the full report (one that has incremental refresh) from the Service? Some time ago I was discussing with someone how annoying it is that you cannot download it, even in template format, through the regular channels.
Those tools are for metadata manipulation. They are not data extraction tools.
What you are referring to are the PBIX download limitations. While that list is long, it has gotten shorter recently.
HOWEVER: incremental refresh is based on having multiple partitions per table, and Power BI Desktop has no concept of that. So even if you were allowed to download a PBIX with incremental refresh data sources, you still could not open it in Power BI Desktop.
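If you want to see those partitions for yourself, a read-only DMV query over the XMLA endpoint will list them. A minimal sketch, assuming the third-party pyadomd package and the ADOMD.NET client library are installed, and that the connection string matches your authentication setup:

```python
# Minimal sketch: list the partitions behind an incremental refresh table
# by querying the TMSCHEMA_PARTITIONS DMV over the XMLA endpoint. Read-only;
# it shows what Desktop cannot: several partitions per table.
# Assumptions: pyadomd plus the ADOMD.NET client library are installed, and
# the connection string (workspace, catalog, credentials) is yours to fill in.
from pyadomd import Pyadomd

CONN_STR = (
    "Provider=MSOLAP;"
    "Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace;"
    "Initial Catalog=S1;"
)

with Pyadomd(CONN_STR) as conn:
    with conn.cursor().execute(
        "SELECT [TableID], [Name] FROM $SYSTEM.TMSCHEMA_PARTITIONS"
    ) as cur:
        for table_id, name in cur.fetchall():
            print(table_id, name)  # one row per partition, several per table
```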