Hi there!
I've got a large dataset that is somewhat slow to refresh, and my team would like the data updated every 30 minutes. I've been thinking about caching the stale data in one dataset in a Power BI capacity workspace that is refreshed daily, and then creating another dataset for the current data that is refreshed every 30 minutes. I'm struggling with the next step: how do I union the two datasets together in Power BI? Is it possible?
Hi @nbarsley, could you follow these steps, please?
Publish both stale (daily) and current (30-minute) datasets to the same Power BI workspace.
In Power BI Desktop, use Get Data > Power BI Datasets to connect to both datasets. Build a composite model by querying both.
Create a calculated table using DAX that unions the two tables (see the sketch after these steps).
Use the CombinedData table to build your report visuals.
Publish the combined model to the Power BI Service for dynamic updates.
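As a minimal sketch of that calculated table, assuming each remote model exposes a table with matching Date and Amount columns. The names StaleData, CurrentData, Date, and Amount are placeholders; substitute your own tables and columns.

CombinedData =
VAR FirstCurrentDate = MIN ( CurrentData[Date] )  // earliest date held in the 30-minute model
RETURN
    UNION (
        // Daily-refreshed history, excluding dates already covered by the current model
        SELECTCOLUMNS (
            FILTER ( StaleData, StaleData[Date] < FirstCurrentDate ),
            "Date", StaleData[Date],
            "Amount", StaleData[Amount]
        ),
        // 30-minute-refreshed current data
        SELECTCOLUMNS ( CurrentData, "Date", CurrentData[Date], "Amount", CurrentData[Amount] )
    )

The FILTER keeps the two sources from double counting days that appear in both models. List every column you need in both SELECTCOLUMNS calls, in the same order and with the same names, because UNION matches columns by position.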
If this post helped, please give a kudos and accept it as a solution.
Thanks in advance.
Have you tried implementing incremental refresh?
https://learn.microsoft.com/en-us/power-bi/connect-data/incremental-refresh-overview
This should help achieve your requirement.
Hi @nbarsley,
As we haven't heard back from you, we wanted to kindly follow up to check whether the solution provided by the community members worked for you. If our response addressed the issue, please mark it as "Accept as solution" and click "Yes" if you found it helpful.
Thanks and regards
Hi @nbarsley,
I wanted to check if you have had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If our response has addressed your query, please accept it as a solution and give a "Kudos" so other members can easily find it.
Thank you.
Hi @nbarsley,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
Thank you.
Can you confirm that this works with more than 1,000,000 rows? It doesn't work for me.
Hi @nbarsley ,
Thanks for reaching out to the Microsoft Fabric community forum.
The limit isn't based on the number of rows but on the storage size of the semantic model.
The maximum size for semantic models imported into the Power BI service is 1 GB. These semantic models are heavily compressed to ensure high performance. In addition, in shared capacity, the service places a limit of 10 GB on the amount of uncompressed data that is processed during refresh. This limit accounts for the compression, and therefore is higher than the 1-GB maximum semantic model size. Semantic models in Power BI Premium aren't subject to these limits. If refresh in the Power BI service fails for this reason, reduce the amount of data being imported to Power BI and try again.
Source: Troubleshoot refresh scenarios - Power BI | Microsoft Learn
If you find this post helpful, please mark it as "Accept as Solution" and consider giving a kudos.
Thanks and Regards
Thanks,
I did try this earlier but received this error message:
Something went wrong
The resultset of a query to external data source has exceeded the maximum allowed size of '1000000' rows.