Hi folks, I have a data model hosted in a Power BI Premium capacity and my fact tables have more than 200M rows. For performance, we have copied our data to Google BigQuery. Since I'm using the large model format in Power BI, I can't download the .pbix file to manage it. Is there a way, with an external tool (like Tabular Editor), to move my data source from SQL Server to Google BigQuery without recreating/rewriting my entire model?
Regards!
200M rows is not large. Set it back to the small model format and then fetch the pbix.
Large model starts around the 5 GB mark.
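If you'd rather script that switch than use the dataset settings page, something along these lines should work against the REST API. Untested sketch: I'm assuming the Update Dataset In Group endpoint and its targetStorageMode property, and the workspace/dataset IDs and token are placeholders.

```python
import requests

# Untested sketch: flip a dataset back to the small (default) storage format
# via the Power BI REST API so the pbix download option becomes available.
# GROUP_ID, DATASET_ID and ACCESS_TOKEN are placeholders for your own values.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
ACCESS_TOKEN = "<aad-access-token>"

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
resp = requests.patch(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    # "Abf" = small model storage, "PremiumFiles" = large model storage
    json={"targetStorageMode": "Abf"},
)
resp.raise_for_status()
print("Storage mode update accepted:", resp.status_code)
```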
Are you saying that you created the dataset in the Power BI service? Was there no original PBIX?
@lbendlin Yes, there's no original pbix; the model came from an Azure Analysis Services migration. I published the .BIM file using Tabular Editor.
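In case it makes the ask clearer, this is roughly the swap I'm hoping to automate: take the exported model.bim, find the M partitions that still point at SQL Server, and repoint them at BigQuery. Rough, untested Python sketch; the file name, the SQL Server connection string and the BigQuery billing project are placeholders for my real setup.

```python
import json

# Rough, untested sketch: open the exported model.bim, find M partitions
# that reference the old SQL Server source, and repoint them at BigQuery.
# The file name and both connection expressions below are placeholders.
with open("model.bim", "r", encoding="utf-8") as f:
    model = json.load(f)

OLD_SOURCE = 'Sql.Database("myserver.database.windows.net", "SalesDW")'
NEW_SOURCE = 'GoogleBigQuery.Database([BillingProject = "my-gcp-project"])'

for table in model["model"]["tables"]:
    for partition in table.get("partitions", []):
        source = partition.get("source", {})
        if source.get("type") != "m":
            continue  # AAS-migrated models may still use legacy query partitions
        expr = source.get("expression", "")
        text = "\n".join(expr) if isinstance(expr, list) else expr
        if OLD_SOURCE in text:
            # Swap the source step; the navigation steps that follow it
            # (schema/table selection) usually need adjusting as well.
            source["expression"] = text.replace(OLD_SOURCE, NEW_SOURCE).split("\n")

with open("model_bigquery.bim", "w", encoding="utf-8") as f:
    json.dump(model, f, indent=2)
```

If the partitions turn out to be legacy query partitions pointing at a provider dataSource instead of M expressions, I guess the dataSources section would need the same kind of swap.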