Hi,
I'm building a data pipeline aimed at updating semantic models. I've watched a few videos and I'm really interested in using variable libraries. However, whenever I assign variables to the workspace and semantic model, I lose the "select all" option under the "table" setting.
Is there any workaround that would allow me to use these variables effectively? Or will I have to do this manually?
I'm using deployment pipelines and would really like to avoid having to go into the dev, QA, and prod workspaces manually to make these changes.
Thanks!
Solved! Go to Solution.
Hi @bdpr_95
When you assign deployment pipeline variables to control the workspace and semantic model connections, Power BI disables the “Select all” option under the table setting because the variable-based configuration replaces the static, pre-selected metadata binding. Essentially, variables introduce dynamic references that prevent Power BI from automatically resolving all table mappings at design time. Unfortunately, there’s currently no direct workaround to restore the “Select all” option when using variables. To manage this efficiently, you can either define the specific table mappings manually once per environment, or automate this process through the Power BI REST API or deployment pipeline API, which lets you script updates to dataset bindings and connections programmatically. While this adds a bit of setup, it ensures you can maintain a fully automated deployment process without having to adjust table mappings manually in each workspace.
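As a rough illustration of the scripted approach mentioned above, here is a minimal sketch of calling the Power BI REST API's `Default.UpdateParameters` endpoint to repoint a semantic model per environment. The workspace/dataset IDs, parameter name, and token acquisition are placeholders you would supply yourself; this only builds the request.

```python
# Sketch: updating a semantic model's parameters per environment via the
# Power BI REST API (Datasets - Update Parameters In Group endpoint).
# Workspace ID, dataset ID, token, and parameter names are placeholders.
import json
import urllib.request

BASE = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_url(workspace_id, dataset_id):
    """Endpoint for POST Default.UpdateParameters on a dataset in a workspace."""
    return f"{BASE}/groups/{workspace_id}/datasets/{dataset_id}/Default.UpdateParameters"

def update_parameters_request(token, workspace_id, dataset_id, params):
    """Build the POST request; send it with urllib.request.urlopen(...)."""
    body = json.dumps(
        {"updateDetails": [{"name": k, "newValue": v} for k, v in params.items()]}
    ).encode()
    return urllib.request.Request(
        update_parameters_url(workspace_id, dataset_id),
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

For example, after deploying to QA you could send `update_parameters_request(token, qa_workspace_id, dataset_id, {"ServerName": "qa-sql.contoso.com"})` to rebind the data source, then trigger a refresh.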
Hi @bdpr_95 ,
You’ve got it right. Right now, when you assign variables from a variable library to a workspace or semantic model, the “Select all” option in the Semantic Model Refresh Template disappears. That’s just how it works at the moment; there’s nothing wrong with your setup. It’s a known limitation while the feature is still in preview: once a variable is applied, the configuration becomes dynamic, so the system can’t show or auto-select all tables the way it can in a static setup.
For now, you’ll need to manually pick which tables to include in the refresh or set them up when creating the template. If you’re looking to streamline changes across Dev, QA, and Prod, variable libraries with deployment pipeline rules or parameters can still help. And if you want full automation, you can handle refresh configs through the Fabric REST API, which doesn’t have this issue. Microsoft knows this impacts workflows, so improvements are on the way as the feature develops.
Thank you,
Tejaswi.
@v-tejrama , is there any possibility of creating a notebook that refreshes the semantic models one by one instead of updating all of them at the same time? The issue is that I have semantic models that need to be refreshed daily and others that only need to be refreshed monthly. So I always have to filter them first and then update the semantic models. Also, I’d like to refresh one semantic model at a time, not all simultaneously. And if possible, I’d like to receive some kind of notification via Teams or email if the process for semantic model X fails.
Hi @bdpr_95 ,
You can definitely achieve that by creating a Fabric notebook that triggers the semantic model refreshes one at a time instead of running them all in parallel. Within the notebook, you can call the Fabric REST API to refresh each semantic model individually, and include simple logic to determine which ones should run based on your schedule so your daily models can refresh more often, while the monthly ones only run when needed. By looping through your model list and waiting for one refresh to complete before starting the next, you can fully control the sequence and avoid overlapping refreshes.
If you’d like to be notified when something goes wrong or when a refresh finishes, you can extend the notebook to send alerts through Microsoft Teams or email. This can be done using a webhook or the Microsoft Graph API, which lets you post a message or send an automated email when a refresh fails or succeeds. This approach gives you a clean and reliable way to manage refreshes with different frequencies and still stay informed without needing to monitor them manually.
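The sequential-refresh-plus-notification idea above could be sketched roughly like this, using the Power BI REST API's `refreshes` endpoints directly. The token, workspace ID, dataset IDs, and Teams incoming-webhook URL are all placeholders you would supply in your own notebook; treat this as an outline, not a finished implementation.

```python
# Sketch: refresh semantic models one at a time, polling the refresh history
# until each finishes, and post to a Teams incoming webhook on failure.
# Token, IDs, and webhook URL are placeholders.
import json
import time
import urllib.request

BASE = "https://api.powerbi.com/v1.0/myorg"
TERMINAL = {"Completed", "Failed", "Disabled", "Cancelled"}

def refreshes_url(workspace_id, dataset_id):
    return f"{BASE}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"

def is_terminal(status):
    """A refresh is finished once its status is no longer 'Unknown' (in progress)."""
    return status in TERMINAL

def start_refresh(token, workspace_id, dataset_id):
    req = urllib.request.Request(
        refreshes_url(workspace_id, dataset_id),
        data=json.dumps({"type": "full"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST")
    urllib.request.urlopen(req)

def latest_status(token, workspace_id, dataset_id):
    req = urllib.request.Request(
        refreshes_url(workspace_id, dataset_id) + "?$top=1",
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        history = json.load(resp)["value"]
    return history[0]["status"] if history else "Unknown"

def notify_teams(webhook_url, text):
    """Post a simple text message to a Teams incoming webhook."""
    req = urllib.request.Request(
        webhook_url, data=json.dumps({"text": text}).encode(),
        headers={"Content-Type": "application/json"}, method="POST")
    urllib.request.urlopen(req)

def refresh_sequentially(token, workspace_id, dataset_ids, webhook_url, poll_seconds=60):
    for ds in dataset_ids:
        start_refresh(token, workspace_id, ds)
        status = "Unknown"
        while not is_terminal(status):   # wait before starting the next model
            time.sleep(poll_seconds)
            status = latest_status(token, workspace_id, ds)
        if status == "Failed":
            notify_teams(webhook_url, f"Refresh failed for semantic model {ds}")
```

The polling loop is what makes the refreshes sequential: the next model only starts once the previous one has reached a terminal status.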
Best Regards,
Tejaswi.
Community Support
Hi @bdpr_95 ,
You can address this by setting up a Fabric notebook that uses the REST API to refresh your semantic models individually, rather than all at once. Within the notebook, you have the flexibility to list your models and define which should be refreshed daily or monthly, applying logic to run each refresh sequentially. This approach helps prevent overlap and potential performance issues.
For notifications, you may include a step in the notebook to send alerts when a refresh completes or encounters an error. Common options include a Microsoft Teams webhook or the Microsoft Graph API for automated email notifications. This ensures you receive timely updates on the status of each model refresh.
After validating the notebook’s functionality, you can schedule it using a Fabric data pipeline. This automates the refresh process for your daily and monthly models, keeping you informed without the need for manual intervention.
Thank you.
@v-tejrama thanks a lot for the detailed explanation! That sounds like a great approach. Could you please share an example of how the notebook code would look? For instance, I have reports A, B, and C that should be refreshed daily, and reports D, E, and F that are refreshed monthly. It would be super helpful to see how that logic could be implemented. Thanks again!
Hi @bdpr_95 ,
Could you take a look at my solution?
Here's the sample code:
import sempy.fabric as fabric  # semantic-link, preinstalled in Fabric notebooks

fabric.refresh_dataset(
    workspace='testWorkspace',
    dataset='testdataset',
    refresh_type='full',
    apply_refresh_policy=False  # pass a boolean, not the string 'false'
)
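The daily/monthly split asked about above could be sketched like this. Model names A–F and the workspace name are placeholders from the question, and the "monthly = 1st of the month" rule is an assumption you can adjust; the actual refresh call (`sempy.fabric.refresh_dataset`) only runs when you invoke `run_refreshes()` from a Fabric notebook.

```python
# Sketch: decide which semantic models are due today, then refresh them
# one by one. Model and workspace names are placeholders.
import datetime

DAILY = ["Model A", "Model B", "Model C"]
MONTHLY = ["Model D", "Model E", "Model F"]

def models_due(today, daily=DAILY, monthly=MONTHLY):
    """Daily models run every day; monthly models only on the 1st (assumed rule)."""
    return list(daily) + (list(monthly) if today.day == 1 else [])

def run_refreshes():
    """Call this from a Fabric notebook, where semantic-link is available."""
    import sempy.fabric as fabric
    for name in models_due(datetime.date.today()):
        # One refresh_dataset call per model; the loop goes model by model.
        fabric.refresh_dataset(
            dataset=name,
            workspace="testWorkspace",
            refresh_type="full",
            apply_refresh_policy=False,
        )
```

Scheduling the notebook daily via a Fabric data pipeline then covers both cadences: the monthly models simply fall out of `models_due` on most days.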
Thank you.
Hi @bdpr_95 ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.
Thank you.
Hi @bdpr_95 ,
Just checking in: have you been able to resolve this issue? If so, it would be greatly appreciated if you could mark the most helpful reply as the solution. This helps other community members quickly find relevant answers.
Thank you.
Hi @bdpr_95 ,
Is your goal to use the variable library to change the data source dynamically as the semantic model moves through stages of the deployment pipeline? If so, another effective solution would be to use parameters on the semantic model and leverage deployment or parameter rules in the deployment pipeline to reassign the data source.
Here is some official documentation from Microsoft on how this is implemented: https://learn.microsoft.com/en-us/fabric/cicd/deployment-pipelines/create-rules?tabs=new-ui
If this helped, please mark it as the solution so others can benefit too. And if you found it useful, kudos are always appreciated.
Thanks,
Samson
Connect with me on LinkedIn
Check out my Blog
No, I'm talking about this: Semantic Model Refresh Templates in Power BI (Preview) | Microsoft Power BI Blog | Microsoft Power B...