Hello everyone,
We are trying to automate the refresh of specific tables (about 28) within a Power BI dataset published to a Pro workspace (hosted in a Premium Capacity area, but our users have Pro licenses only).
We want to refresh these tables one by one, in a specific order, and track success (or failure) after each step, without manually updating them through the Power BI Desktop UI.
Here’s what we’ve already tried:

1. Opening the .pbix locally and running a C# script. However, the RequestRefresh() and SaveChanges() methods aren't available when connected in import mode; they fail silently or throw errors depending on context.
2. Connecting directly to the semantic model via the XMLA endpoint. We still run into issues: tableCollection.Find() and Model.SaveChanges() don't work as expected, possibly due to permission/context issues, and the environment doesn't seem to expose all TOM capabilities.
3. Using the fabric.refresh_dataset() and refresh_selected_tables() scripts from Fabric Guru. The script fails due to missing tokens, likely because the authentication flow expects a Notebook context or extra libraries (synapse, trident_token_library_wrapper, etc.), so we cannot run it locally as a standalone script.
Although the dataset is hosted in the Service, we are open to Desktop-based solutions (e.g., using Tabular Editor 2 with scripting) — as long as the refresh of individual tables in sequence can be automated without UI interaction.
We’d greatly appreciate help from the community or Microsoft team.
Is there a supported or semi-supported way to do this with Pro licensing and XMLA write access enabled?
Thanks in advance!
Hi @paulos
You could use Power Automate or a Fabric Notebook to refresh each table in the order you want. You can specify the name of the table you want to refresh, then monitor that table's refresh, and only move on to the next table once it has completed.
Here is an example of how to do this using Power Automate: Leveraging the Advanced Refresh API using Power Automate - minceddata - Obsidian Publish
For a more detailed implementation with the monitoring you need, you may want to engage a consultant who can do this work for you.
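As an alternative to Power Automate, the same per-table sequencing can be sketched in Python against the Enhanced Refresh REST API. This is a minimal sketch under assumptions not stated in the thread: WORKSPACE_ID, DATASET_ID, and TOKEN are placeholders you must supply, and the table names are illustrative; it is not a turnkey implementation.

```python
# Sketch: sequential per-table refresh via the Power BI Enhanced Refresh API.
# Placeholders (assumptions): workspace_id, dataset_id, token, table names.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_body(table: str) -> dict:
    """Enhanced Refresh payload scoped to a single table."""
    return {
        "type": "full",
        "commitMode": "transactional",
        "objects": [{"table": table}],
    }

def refresh_table(workspace_id: str, dataset_id: str, token: str, table: str) -> str:
    """POST a refresh for one table; returns the polling URL (Location header)."""
    url = f"{API}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_refresh_body(table)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # The service replies 202 Accepted; the request id for status
        # polling is carried in the Location header.
        return resp.headers.get("Location")

def is_done(status: str) -> bool:
    """Treat these as terminal states when polling the refresh status."""
    return status in ("Completed", "Failed", "Cancelled")

def refresh_sequentially(workspace_id, dataset_id, token, tables):
    for t in tables:
        location = refresh_table(workspace_id, dataset_id, token, t)
        # Poll `location` with GET until is_done(status) is True,
        # then (and only then) continue with the next table.
```

Note that table-level (partition) scoping in Enhanced Refresh requires the dataset to sit on Premium/Fabric capacity, which matches your setup.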
Hi @paulos,
I hope you had a chance to review the solutions shared by @GilbertQ and @Poojara_D12. If one addressed your question, please consider accepting it as the solution — it helps others find answers more quickly.
If you're still facing the issue, feel free to reply, and we’ll be happy to assist further.
Thank you.
I don't think I will mark any of the answers as the solution. I've seen the link GilbertQ shared, and although it seems interesting, I'm surprised at the complexity we would have to deal with just for something as simple as a partial refresh of a dataset. I would have to look at the answers in more detail, but I don't think I will mark them as solutions. Tell me if it's mandatory to do so.
Hi @paulos,
Please check the response from @Poojara_D12, as it addresses your scenario. He explains that XMLA/TOM and sempy-based methods may not work as expected due to token and permission limitations in Pro environments. He also suggests using the Enhanced Refresh API with a Service Principal, which supports table-level (partition) refresh in Premium workspaces.
Thank you.
I always feel somewhat pressured to accept as a solution things I don't consider solutions. I don't think that's a fair way to portray to the world that something is really solved, but sure, I will accept the first one. Both are helpful, although for me neither is really what I was looking for; I appreciate the information all the same.
Hi @paulos
As far as I am aware you can use a fabric notebook to connect to semantic models in pro workspaces, so this can be done.
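If you go the Fabric notebook route, the semantic-link library (the sempy package, preinstalled in Fabric notebooks) exposes a refresh_dataset call that can be scoped to individual tables. A minimal sketch, assuming illustrative dataset/workspace/table names that you would replace with your own:

```python
def as_refresh_objects(tables):
    """Build the per-table 'objects' list that sempy's refresh_dataset accepts."""
    return [{"table": t} for t in tables]

def refresh_in_order(tables, dataset, workspace):
    """Run inside a Fabric notebook; sempy handles authentication there."""
    import sempy.fabric as fabric  # available in the Fabric notebook runtime

    for t in tables:
        # Scoping each call to a single table gives the sequential,
        # one-by-one behaviour asked for. Depending on your sempy version
        # you may need to poll the refresh status before moving on.
        fabric.refresh_dataset(
            dataset=dataset,
            workspace=workspace,
            refresh_type="full",
            objects=as_refresh_objects([t]),
        )
```

Usage would look like `refresh_in_order(["Dim_Date", "Fact_Sales"], "YourDataset", "YourWorkspace")`; the names are placeholders, not values from this thread.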
Hi @paulos
You're aiming to automate the sequential refresh of around 28 specific tables within a Power BI dataset that's hosted in a Premium Capacity workspace, although your users only have Power BI Pro licenses. While you've explored several paths, including scripting with Tabular Editor 2 and 3, XMLA endpoint access, and Microsoft's semantic-link Python library (the sempy package), each approach has run into roadblocks due to limitations in permissions, unsupported commands in import mode, or incomplete token-based authentication when not running in a Fabric or Notebook environment.
To clarify the core issue: although Premium Capacity workspaces allow XMLA read/write access, most Pro-licensed users can't take full advantage of XMLA write operations like RequestRefresh() or Model.SaveChanges() outside the supported tooling or automation contexts. Power BI Desktop doesn't expose the refresh-by-table functionality for scripting, and Tabular Editor 2 lacks the necessary TOM support in import models unless the dataset is opened in a way that grants elevated metadata control. Even Tabular Editor 3, while more powerful, is limited by service-enforced security boundaries.
Unfortunately, there's currently no fully supported method under Pro licensing to automate individual table refreshes in a published dataset using only XMLA or scripting without leveraging Fabric Notebooks or the enhanced token environments they provide. Your options are likely narrowed to:
Using Power BI REST API to trigger full or partial dataset refreshes — though it cannot refresh individual tables directly. You can use Enhanced Refresh (in Premium) if you can generate a valid access token and scope the refresh to partitions (but this typically requires a service principal or Azure AD app with workspace access).
Deploying a service principal with service account access and running enhanced-refresh REST API calls, which can support per-table or per-partition refresh in Premium (not in shared capacity). This requires setting up an Azure AD app and registering it with tenant-wide admin consent.
Using Tabular Editor 3 or TOM scripts in a fully supported context (e.g., through Azure Automation or a local VM with elevated access and correct token flow), which still requires a Premium workspace and write access via service principal or a user with appropriate rights.
In conclusion, while you're close with your current approaches, the missing piece is authenticated, token-based access through a supported automation context (like Azure Functions or Fabric Notebooks). You might consider raising a request with your Power BI administrator to enable service principal authentication and enhanced refresh support, which will allow you to use the REST API or TOM-based tools to control refreshes at the table/partition level as you intended.
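For the service-principal route mentioned above, the app-only token can be obtained with the msal library. This is a hedged sketch, not a complete setup guide: TENANT_ID, CLIENT_ID, and CLIENT_SECRET are placeholders, and it assumes your admin has enabled "Allow service principals to use Power BI APIs" and added the principal to the workspace.

```python
# Sketch: acquiring an app-only Power BI token for a service principal.
# All ids/secrets below are placeholders you must supply.
POWERBI_SCOPE = "https://analysis.windows.net/powerbi/api/.default"

def authority_url(tenant_id: str) -> str:
    """Azure AD authority for the tenant that hosts the workspace."""
    return f"https://login.microsoftonline.com/{tenant_id}"

def acquire_powerbi_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    import msal  # pip install msal

    app = msal.ConfidentialClientApplication(
        client_id,
        authority=authority_url(tenant_id),
        client_credential=client_secret,
    )
    result = app.acquire_token_for_client(scopes=[POWERBI_SCOPE])
    if "access_token" not in result:
        # msal returns error details in the result dict rather than raising.
        raise RuntimeError(result.get("error_description", "token acquisition failed"))
    return result["access_token"]
```

The returned token is what you would pass as the Bearer token to the Enhanced Refresh API calls, letting the whole sequence run unattended without a signed-in user.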