Simon_Striegel
Regular Visitor

Large database: customize import?

Dear community!

I am dealing with a relatively large MySQL database (15-25 GB) and would like to avoid having Power BI import the entire database. Is there any way to customize/define the data that Power BI imports via "Get Data", e.g. by importing only the most recent database entries (by timestamp)?

I would be very grateful for an answer! Thanks in advance!

Best wishes

Simon

 
2 REPLIES
v-lili6-msft
Community Support

Hi @Simon_Striegel,

Yes, you can do this in Edit Queries (the Power Query Editor) by filtering the table before it is loaded. Please refer to these similar posts:

https://community.powerbi.com/t5/Desktop/Filter-query-by-date-column/td-p/333186

https://blog.crossjoin.co.uk/2018/01/08/in-the-previous-date-filters-in-power-bi-gettransform-power-...
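In short, after connecting to MySQL you add a filter step on the timestamp column in the Power Query Editor, and in many cases that filter is folded into the SQL sent to MySQL so that only the matching rows are imported. A minimal sketch of such a query (the server, database, table, and column names are placeholders, not your actual schema):

    let
        // Connect to the MySQL server and database (placeholder names)
        Source = MySQL.Database("myserver", "mydatabase"),
        // Navigation step similar to what Power BI generates when you pick a table
        Orders = Source{[Schema = "mydatabase", Item = "orders"]}[Data],
        // Keep only the last 7 days; if this step folds, MySQL evaluates the
        // filter and the full table is never transferred to Power BI
        Recent = Table.SelectRows(Orders, each [created_at] >= Date.AddDays(DateTime.FixedLocalNow(), -7))
    in
        Recent

You can right-click the filter step and choose "View Native Query" to check whether it has been folded to the source.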

 

Best Regards,

Lin

Community Support Team _ Lin
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Simon_Striegel
Regular Visitor

Thank you for your suggestion. Unfortunately, correctly filtering data based on the current time is not really the problem I'm grappling with. Perhaps a more lucid description of my problem is needed.

The data I wish to analyse using Power BI is in a SQL database and includes some *very large* tables -- several hundred million rows and tens of GB -- with a timestamp as the primary key. These data are generated by the application, and there are already processes in place to discard "aged" data from these tables.

The Power BI analysis will *always* have a time bracket on the data, ranging from the past few hours to the past days/weeks -- i.e., a much smaller subset of the table that the SQL database can easily and quickly provide via the primary key.

When I "Get Data" in PowerBI, I would like to limit the amount of data loaded to, say, the past 12 hours (and not all the several 100 million rows). I would appreciate any suggestions on how best to do achieve this.

Kind regards,

Simon
