Mahhin_Shahzad1
Frequent Visitor

Help Me Optimize This Weird API-Based Near Real-Time Power BI Embed Refresh Solution I've Created

Hey Guys,

I have built an admittedly weird Power BI solution: a fully dynamic, API-based refresh that sets dynamic M parameters in the source query to enforce row-level security (RLS).

TLDR
I set up a Power BI embedding solution that updates M parameters and runs a full dataset refresh on every view. It's clever but slow, and because there is only one dataset, it can hold only one customer's data at a time. I'm looking for something better that meets my requirements for speed, security, isolation, and cost.

The Situation:

We are a budgeting/forecasting company. Users update budgets or forecasts in our system and need to see those changes reflected in an embedded Power BI report almost immediately, so a scheduled refresh was not sufficient.

What I currently do:

  • Whenever someone opens a dashboard,

  • I update the M parameters based on the most recent details,

  • then I trigger a full dataset refresh, and finally generate the embed token for the dashboard.
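The three steps above can be sketched against the documented Power BI REST API endpoints. This is a minimal sketch, assuming a service-principal bearer token and illustrative names (the `ClientId` parameter, group/report/dataset IDs are placeholders, not from the thread); the helpers only build the request URL and body, and the caller would POST them with the token attached.

```python
# Assumption: an Azure AD access token is obtained separately; all IDs below
# and the "ClientId" M parameter name are hypothetical examples.
API = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_request(group_id: str, dataset_id: str, client_id: str):
    """Step 1: repoint the dataset's dynamic M parameter at one client's data."""
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters"
    return url, {"updateDetails": [{"name": "ClientId", "newValue": client_id}]}

def refresh_request(group_id: str, dataset_id: str):
    """Step 2: trigger a dataset refresh (then poll /refreshes until it completes)."""
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    return url, {"notifyOption": "NoNotification"}

def embed_token_request(group_id: str, report_id: str, dataset_id: str,
                        username: str, roles: list[str]):
    """Step 3: generate an embed token carrying the RLS identity."""
    url = f"{API}/groups/{group_id}/reports/{report_id}/GenerateToken"
    return url, {
        "accessLevel": "View",
        "identities": [{"username": username, "roles": roles,
                        "datasets": [dataset_id]}],
    }
```

Waiting for step 2 to finish before step 3 is what makes every page view pay the full refresh cost, which is the latency problem described below.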

ChatGPT led me to this approach, and I never got any real feedback on it. I even asked Microsoft support for help, but they suggested asking the community.

I'm not an expert, but I see many problems with this:

  • It takes a lot of time: for a big dataset, sometimes a minute or more.

  • I have another external API delay process on top of that, which makes things worse. That’s a major concern.

  • Only one dataset exists, so it can hold only one company's forecast at a time, and if a refresh fails, a user may end up seeing another client's stale data.

My Concerns:

Doing a full dataset refresh each time is not good. Someone suggested incremental refresh, but I assumed it wouldn't work in my case since there is no time dimension, and I thought the model recalculates everything each time anyway.

I recently read in the Kimball book that for budgets you ideally store only the line items that have changed, and only those need to be updated.

My system is not optimized this way, but I think I can find some way to add that policy.

Even so, the issue remains: the existing data gets wiped before the new data lands. That's not ideal. Ideally I could retain the old data and push the current forecast into a partition under its own label: if that label already exists, update it; if it doesn't, create a new one and keep the most recent version. That would reduce refresh time as well.

This way, the same forecast would not be reloaded each time, and only then could I apply incremental-refresh logic on top.

Other Things I Need:

The last requirement is that I want to give clients edit access to the dashboard from my external app, so they can add their own logos and customize things a bit. I don't understand this part well either, but I think it would only be possible with some level of isolation, maybe a separate workspace for each client.

I just want someone who has dealt with something similar to please guide me:

  • Am I even on the right path?

  • Should I consider things like Push Datasets, Hybrid models, etc.?

  • I don't use DirectQuery because it's way too limited, and I can't deal with all those restrictions.

So please suggest what would ideally be done in a situation like this. Any documentation, guidance, architecture references, or keywords I should research further would be really appreciated. I feel lost and want to start reviewing and optimizing this approach properly and figure out the next steps.

3 Replies
v-sshirivolu
Community Support

Hi @Mahhin_Shahzad1,
Thank you for reaching out to the Microsoft Fabric community, and for sharing an overview of your current Power BI embedded solution.

Your implementation using dynamic M parameters, full dataset refreshes, and embed token generation effectively supports near real-time updates with row-level security. However, this approach presents challenges such as increased refresh latency, scalability limitations, and potential security risks in a multi-tenant environment.

Refreshing the entire dataset each time a user accesses a report can be inefficient, particularly with large data volumes. I suggest considering incremental refresh strategies; even without a traditional date column, you can leverage a surrogate key like Forecast ID or Version ID to partition data and optimize refresh processes.
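The surrogate-key idea can pair with the enhanced refresh API, which lets a refresh target a single named partition instead of the whole model. A minimal sketch of the request body (the payload shape follows the documented enhanced refresh API; the table and partition names are illustrative assumptions):

```python
def partition_refresh_payload(table: str, partition: str) -> dict:
    # Body for POST .../datasets/{id}/refreshes (enhanced refresh):
    # reprocesses only the listed partition, not the full dataset.
    # "Budget" / "forecast_42"-style names are hypothetical examples.
    return {
        "type": "full",                 # full process, but only for the objects below
        "commitMode": "transactional",  # all-or-nothing: a failed refresh keeps old data intact
        "objects": [{"table": table, "partition": partition}],
    }
```

The `transactional` commit mode is relevant to the failure scenario raised above: if the refresh fails, the previous partition contents remain visible rather than a half-wiped state.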

To further enhance security and customization, you may want to isolate each client’s data by using separate workspaces or datasets. This approach supports tenant-specific branding and reduces the risk of data exposure in case of refresh failures.

For near real-time requirements, options such as Push Datasets or Hybrid Models (utilizing both DirectQuery and Import modes) can help minimize latency while maintaining model flexibility. If client-specific visual customization is needed, embedding with the Power BI JavaScript API allows for dynamic theming and interactive features, especially in tenant-isolated setups.
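For context on the push-dataset option mentioned above: rows POSTed to a push dataset become queryable immediately, with no refresh step at all, though the documented limits apply (for example, 10,000 rows per single POST). A sketch of the request, with a hypothetical dataset and table name:

```python
API = "https://api.powerbi.com/v1.0/myorg"

def push_rows_request(dataset_id: str, table: str, rows: list[dict]):
    # POST rows into a push dataset table; no dataset refresh is needed.
    # "Forecast" and the row shape are illustrative, not from this thread.
    url = f"{API}/datasets/{dataset_id}/tables/{table}/rows"
    return url, {"rows": rows}
```

As noted later in the thread, push datasets have an uncertain future, so treat this as a comparison point rather than a recommendation.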

Finally, automating the deployment of tenant-specific datasets, parameters, and security roles through Power BI REST APIs or deployment pipelines can further streamline your solution. You are making strong progress, and these steps will support ongoing improvements in performance, security, and scalability.

 

Here are some useful resources for further exploration:

Incremental refresh for semantic models in Power BI - Power BI | Microsoft Learn

Advanced incremental refresh and real-time data with the XMLA endpoint in Power BI - Power BI | Micr...

Enhanced refresh with the Power BI REST API - Power BI | Microsoft Learn

Real-time streaming in Power BI - Power BI | Microsoft Learn


Regards,
Sreeteja.

 

 

lbendlin
Super User

1. Maybe use a budgeting/forecasting tool like Anaplan?

2. Push datasets are a dying breed.  Microsoft already tried to kill it once and they will try again. Also, they are live connections only.

3. It is not clear if/how/when Fabric will work with embedded scenarios

4. You can do manual partition management via XMLA, and the partition rules don't have to be time based.
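Point 4 above can be made concrete with a TMSL `createOrReplace` command run against the XMLA endpoint (e.g. from SSMS or Tabular Editor). This sketch builds one partition per forecast, keyed by a surrogate `ForecastId` rather than a date; the server, database, table, and column names are all made-up examples:

```python
def forecast_partition_tmsl(database: str, table: str, forecast_id: str) -> dict:
    # TMSL createOrReplace for a partition whose M query filters one forecast.
    # Creates the partition if absent, replaces it if present, which matches
    # the create-or-update flow described earlier in the thread.
    name = f"forecast_{forecast_id}"
    m_expr = (
        'let Src = Sql.Database("server", "db"), '
        f'T = Src{{[Name="{table}"]}}[Data], '
        f'F = Table.SelectRows(T, each [ForecastId] = "{forecast_id}") in F'
    )
    return {
        "createOrReplace": {
            "object": {"database": database, "table": table, "partition": name},
            "partition": {"name": name, "source": {"type": "m", "expression": m_expr}},
        }
    }
```

Refreshing then only touches the partition for the forecast that changed, which is exactly the non-time-based partition rule mentioned above.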

Actually, we are a forecasting company ourselves, similar to Anaplan, and I had to build a robust multi-client architecture.

The other suggestions were helpful — especially the partitioning via XMLA endpoint. I’m going to look into that further.

