This is in Power BI Desktop.
I have 40 million rows (and counting) in a MySQL database, and one column is a JSON payload that has to be parsed. That isn't a problem in itself, but loading takes a while: about 2 minutes per million records, so roughly 1.5 hours now, and it keeps growing.
I'd like to append only the new rows, but I can't figure out how to load just the rows with id > 40000000 without throwing away the rows with id < 40 million that are already in the model. Is there a way to do this?
I'm considering a workaround: separate queries for each month (or for each chunk of id numbers) that I then combine, but I suspect it will take extra work to get them behaving as the single dataset they should be.
Can I do this easily, and if not, is there a harder but effective way?
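For reference, the kind of source query I have in mind for pulling only the new rows is sketched below. The table name (events), the column names, and the JSON paths are placeholders rather than my real schema:

-- Hypothetical schema: table `events` with an auto-increment `id` and a JSON column `payload`.
-- Goal: fetch only rows added since the last load and parse the JSON on the MySQL side.
SELECT
    id,
    created_at,
    JSON_UNQUOTE(JSON_EXTRACT(payload, '$.status'))   AS status,
    JSON_UNQUOTE(JSON_EXTRACT(payload, '$.customer'))  AS customer
FROM events
WHERE id > 40000000;   -- highest id already loaded into the model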
Here are some suggestions for you.
a. Write a specific SQL query that imports only the required columns and rows from the MySQL database into Power BI Desktop.
b. During the import process, disable the highlighted import options in Power BI Desktop.
c. If necessary, create views for different date/time ranges in the MySQL database, and then import those views into Power BI Desktop (see the sketch after this list).
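For suggestion c, a per-month view could look like the sketch below. The table, column, and JSON path names (events, created_at, payload, $.amount) are placeholders to adapt to your own schema; each view restricts the rows to one date range and extracts the JSON fields on the MySQL side so Power BI receives flat columns:

-- Hypothetical example: one view per month over an `events` table with a JSON `payload` column.
CREATE VIEW events_2024_01 AS
SELECT
    id,
    created_at,
    JSON_UNQUOTE(JSON_EXTRACT(payload, '$.amount')) AS amount
FROM events
WHERE created_at >= '2024-01-01'
  AND created_at <  '2024-02-01';

In Power BI Desktop you would then connect to each view (or only to the newest ones) instead of to the full table.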
Regards,
Lydia