Hi Folks, I have a data model hosted in Power BI Premium Capacity, and my fact tables have more than 200M rows. For performance, we have copied our data to Google BigQuery. Because the dataset uses the large model format in Power BI, I can't download the PBIX file to manage it. Is there a way, with an external tool (like Tabular Editor), to move my data source from SQL Server to Google BigQuery without recreating/rewriting my entire model?
Regards!
200M is not large. Set it back to small model and then fetch the pbix.
Large Model starts around the 5GB mark.
Are you saying that you created the dataset in the Power BI service? Was there no original PBIX?
@lbendlin Yes, there's no original PBIX; the model came from an Azure Analysis Services migration. I published the .BIM file using Tabular Editor.
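Since a .BIM file is plain Tabular Object Model JSON, one option is to edit the partitions' M expressions directly (or do the same via Tabular Editor's Advanced Scripting) and redeploy with Tabular Editor, without touching the rest of the model. Below is a minimal Python sketch of that idea, assuming the partitions call the `Sql.Database` connector; the function name and the one-line regex swap are illustrative only. Note that `GoogleBigQuery.Database()` takes no server/database arguments, so the navigation steps that follow `Source` in each query will usually need hand-editing too.

```python
import re

def repoint_to_bigquery(bim_text: str) -> str:
    """Illustrative sketch: swap every Sql.Database("server", "db") call in a
    .bim file's M expressions for the Google BigQuery connector. Both
    connector names are real Power Query functions; everything else here
    (the file handling, the blanket regex) is an assumption to sketch the idea."""
    return re.sub(r'Sql\.Database\([^)]*\)', 'GoogleBigQuery.Database()', bim_text)

# Example usage on one M line as it might appear inside a partition expression:
line = 'Source = Sql.Database("myserver", "mydb"),'
print(repoint_to_bigquery(line))  # Source = GoogleBigQuery.Database(),
```

After rewriting the file, validate it by opening it in Tabular Editor and refreshing one small partition before redeploying the whole model.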