Hello all,
I have a case where data is processed and stored in Google BigQuery, and we have to move it to Azure Database for MySQL and connect it to Power BI so that several users can develop dashboards and read them.
To transfer the data, we chose to export from BigQuery to Google Cloud Storage, splitting each DataFrame into CSV shards.
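For context, this is roughly how we do the export with the google-cloud-bigquery Python client (a minimal sketch; the project, dataset, table, and bucket names are placeholders):

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

# A wildcard in the destination URI tells BigQuery to shard the export
# into multiple CSV files (required for tables larger than 1 GB).
destination_uri = "gs://my-export-bucket/fact_table/part-*.csv"

job_config = bigquery.ExtractJobConfig()
job_config.destination_format = bigquery.DestinationFormat.CSV

extract_job = client.extract_table(
    "my-gcp-project.my_dataset.fact_table",  # placeholder table id
    destination_uri,
    job_config=job_config,
)
extract_job.result()  # block until the export finishes
```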
Then we load from GCS into Azure Database for MySQL and UNION those CSVs; the final result will be two tables of approximately 70M rows (~25 GB of CSV data). I believe a daily refresh will be more than enough.
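The load step looks roughly like this (again just a sketch: it assumes the shards were first copied down from GCS, e.g. with gsutil, that the target table already exists, and that local_infile is enabled on the server; host, credentials, and paths are placeholders):

```python
import glob
import pymysql

conn = pymysql.connect(
    host="myserver.mysql.database.azure.com",  # placeholder host
    user="myadmin",
    password="...",
    database="analytics",
    local_infile=True,  # must also be enabled server-side
)

with conn.cursor() as cur:
    # Appending every shard into one table is what replaces the manual UNION.
    for shard in sorted(glob.glob("/data/fact_table/part-*.csv")):
        cur.execute(
            "LOAD DATA LOCAL INFILE %s INTO TABLE fact_table "
            "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
            "LINES TERMINATED BY '\\n' IGNORE 1 LINES",
            (shard,),
        )
conn.commit()
conn.close()
```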
Regarding this setup, I have several questions:
- Is Azure Database for MySQL a good choice of database for this problem, or should I use another service (Azure SQL, Azure Spark, ...)?
- In terms of data load, it will be impossible to work with the full dataset locally, and DirectQuery is not available for MySQL. Is it possible to define the database structure and types in the web version, program incremental imports (see the sketch after this list), and work locally with only a portion of the dataset? I remember having issues when trying to keep the whole dataset loaded on the web while loading only a sample locally...
- Do you have an idea of the pricing plan we should go for to support this data volume? I believe the free Power BI version only allows storing dashboards with 1 GB of data.
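To illustrate what I mean by incremental imports in the second question: rather than re-moving all ~70M rows each day, I'm considering making the transfer itself incremental, along these lines (a rough sketch; the watermark column, table names, and connection details are assumptions):

```python
from google.cloud import bigquery
import pymysql

# 1. Find the newest row already loaded into Azure MySQL.
mysql_conn = pymysql.connect(
    host="myserver.mysql.database.azure.com",  # placeholder host
    user="myadmin", password="...", database="analytics",
)
with mysql_conn.cursor() as cur:
    cur.execute(
        "SELECT CAST(COALESCE(MAX(load_date), '1970-01-01') AS DATE) "
        "FROM fact_table"
    )
    watermark = cur.fetchone()[0]
mysql_conn.close()

# 2. Export only the rows past that watermark from BigQuery to GCS.
bq = bigquery.Client(project="my-gcp-project")  # placeholder project
bq.query(
    """
    EXPORT DATA OPTIONS (
        uri='gs://my-export-bucket/daily/part-*.csv',
        format='CSV', overwrite=true, header=true)
    AS SELECT * FROM my_dataset.fact_table WHERE load_date > @watermark
    """,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("watermark", "DATE", watermark)
        ]
    ),
).result()

# 3. Load the delta files with the same LOAD DATA loop as above.
```

On the Power BI side, I understand the built-in incremental refresh feature is driven by the reserved RangeStart/RangeEnd parameters and needs at least a Pro license, which ties into my pricing question.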
Thanks for your answers =D
L
Please let me know if I should provide more details or if the questions are unclear...
best,