Hi Folks, I have a data model hosted in Power BI Premium capacity, and my fact tables have more than 200M rows. For performance, we have copied our data to Google BigQuery. Because I'm using the large model format in Power BI, I can't download the PBIX file to manage it. Is there a way, with an external tool (like Tabular Editor), to move my data source from SQL Server to Google BigQuery without recreating/rewriting my entire model?
Regards!
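For readers looking for a concrete starting point: a metadata-only change like this can be scripted against the workspace's XMLA endpoint with the Tabular Object Model (TOM) client library, which is the same API Tabular Editor works against. The sketch below is illustrative only; the workspace name, dataset name, and the SQL Server/BigQuery M snippets are assumed placeholders, not values from this thread, and it assumes the partitions use Power Query (M) expressions.

```csharp
// Minimal sketch using the TOM client library (Microsoft.AnalysisServices.Tabular NuGet package).
// Assumes the workspace has XMLA read/write enabled.
// "MyWorkspace", "SalesModel", and the M snippets below are placeholders.
using System;
using Microsoft.AnalysisServices.Tabular;

class SwapDataSource
{
    static void Main()
    {
        var server = new Server();
        server.Connect("powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace"); // XMLA endpoint

        Database db = server.Databases.GetByName("SalesModel");
        Model model = db.Model;

        foreach (Table table in model.Tables)
        {
            foreach (Partition partition in table.Partitions)
            {
                // Only handle Power Query (M) partitions that point at SQL Server.
                if (partition.Source is MPartitionSource m && m.Expression.Contains("Sql.Database"))
                {
                    // Swap the source function; the navigation steps that follow it
                    // (schema/table selection) usually need adjusting as well.
                    m.Expression = m.Expression.Replace(
                        "Sql.Database(\"myserver\", \"mydb\")",
                        "GoogleBigQuery.Database()");
                    Console.WriteLine($"Updated {table.Name} / {partition.Name}");
                }
            }
        }

        model.SaveChanges();   // pushes the metadata change to the service
        server.Disconnect();
    }
}
```

Because only metadata changes, the dataset still has to be refreshed afterwards so the partitions repopulate from BigQuery. The same edit can also be made interactively in Tabular Editor connected to the XMLA endpoint.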
200M rows is not large. Set the dataset back to small model format and then fetch the PBIX.
Large model format only starts to matter around the 5 GB mark.
Are you saying that you created the dataset in the Power BI service? Was there no original PBIX?
@lbendlin Yes, there's no original PBIX; the model came from an Azure Analysis Services migration. I published the .BIM file using Tabular Editor.
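For completeness, deploying an edited .BIM over the XMLA endpoint (similar in effect to Tabular Editor's deploy) can also be scripted with TOM. A minimal sketch, with the file path, workspace, and dataset names as assumed placeholders:

```csharp
// Hedged sketch of deploying a model.bim file over the XMLA endpoint with TOM.
// The file path, workspace, and dataset names are placeholders.
using System.IO;
using Microsoft.AnalysisServices.Tabular;

class DeployBim
{
    static void Main()
    {
        string bim = File.ReadAllText(@"C:\models\SalesModel.bim");

        // Turn the .BIM (TMSL database definition) into a TOM Database object.
        Database database = JsonSerializer.DeserializeDatabase(bim);
        database.ID = "SalesModel";
        database.Name = "SalesModel";

        var server = new Server();
        server.Connect("powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace");

        // Script and execute a createOrReplace for the whole database.
        string tmsl = JsonScripter.ScriptCreateOrReplace(database);
        server.Execute(tmsl);

        server.Disconnect();
    }
}
```

Note that datasets created or modified through the XMLA endpoint in this way can no longer be downloaded as a PBIX, which matches the situation described above.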