I use SQL views as datasets (Import mode). After I have built a whole report with multiple visuals, measures, and calculated columns, is there a way to simply replace the data table with another one (all headings are the same)? Basically: save the report as a new report, swap in a different SQL view table, and keep all the visuals/measures/calculated columns intact?
Also, why is dataset size such an issue? Is there a way to have an unlimited import size on a dataset, or how can I work with huge datasets without hassles?
If you go into Transform data and click the current table, you can swap the M code in the Advanced Editor. Provided the columns are the same, the visuals should be maintained.
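To illustrate the swap, here is a minimal Power Query M sketch of the kind of query you would see in the Advanced Editor. The server, database, and view names are hypothetical; the point is that only the `Item` (view name) needs to change, and all downstream steps, measures, and visuals keep working as long as the new view returns the same column names and types.

```m
let
    // Assumed server/database names for illustration
    Source = Sql.Database("MyServer", "MyDatabase"),
    // To repoint the report, change only the Item value,
    // e.g. from "vw_SalesOld" to "vw_SalesNew":
    SalesView = Source{[Schema = "dbo", Item = "vw_SalesNew"]}[Data]
in
    SalesView
```

After editing, close the Advanced Editor and choose Close & Apply; Power BI refreshes the table from the new view and rebinds the existing visuals, measures, and calculated columns against the same column names.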
Dataset size refers to the size after the data is compressed into VertiPaq, which has excellent compression. However, the whole model is loaded into memory at that point, so the bigger it is, the more resources it uses. If you're using a star schema, Power BI can handle vast datasets. Have a read of Data reduction techniques for Import modeling - Power BI | Microsoft Docs.
Hi @NadiaLR1234
Have you solved this question with bcdobbs's help? If so, you can accept the helpful answer as the solution, or share your own method and accept that as the solution. Thanks for your contribution to improving Power BI.
Also, I notice that you are new to the forum, so in case anything is unfamiliar, let me take a screenshot to demonstrate:
If you need more help, please let me know.
Best Regards,
Community Support Team _Tang
If this post helps, please consider accepting it as the solution to help other members find it more quickly.