PowerBI_Mallow
Frequent Visitor

Append query for two data sources

We have three years of business data. The most recent two years are always kept in a MySQL database, and older years are stored in SharePoint (currently one year of data). Next year there will be two years of data in SharePoint and two years in the MySQL database, and this pattern will continue.

Now we want to append the data from the MySQL database and SharePoint. I would like to import the SharePoint data and incrementally refresh the data from the MySQL database. Is this possible?

Please provide your suggestions on this.

5 REPLIES
Anonymous
Not applicable

Hi @PowerBI_Mallow 

 

60 GB of data is huge; it's not recommended to load all of it into Power BI Desktop, and doing so may hit some limitations.

 

Firstly, if you use Import mode, there is a 1 GB model size limit when you publish a report to a shared capacity in the Power BI Service. For Premium capacities the model size can be larger, depending on the SKU; for more information, read the Power BI Premium support for large semantic models article. (Semantic models were previously known as datasets.) Import models are loaded with data that is compressed and optimized, then stored to disk by the VertiPaq storage engine. When source data is loaded into memory, 10x compression is possible, so it's reasonable to expect 10 GB of source data to compress to about 1 GB. At that ratio, your 60 GB of source data would still compress to roughly 6 GB, well above the 1 GB limit for shared capacities.

 

Secondly, there is a refresh time limit when refreshing data in the Power BI Service. Power BI Pro models (in shared capacities) have a refresh time limit of two hours; for models in a Premium capacity, the limit is five hours. Refreshing a large data model may therefore hit the refresh time limit and fail.

 

If you want to use this data to build reports, you won't display all of it in the report, right? You will typically aggregate it and then display it in visuals. Reducing the amount of data loaded into the model as much as possible is always the first suggestion; it can greatly improve data refresh and report rendering performance. See Data reduction techniques for Import modeling - Power BI | Microsoft Learn. You can also pre-aggregate data on the data source side and then load the aggregated data into Power BI Desktop (a sketch of this follows below).
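As a rough illustration, here is a minimal Power Query (M) sketch of pre-aggregating at the source. The server, database and table names (myserver, mydb, sales) and the OrderDate/Amount columns are hypothetical placeholders; the point is that a simple grouping step like this can often fold back to MySQL, so only aggregated rows are transferred.

    let
        // Hypothetical MySQL connection; replace server/database names with your own
        Source = MySQL.Database("myserver", "mydb"),
        Sales = Source{[Schema = "mydb", Item = "sales"]}[Data],
        // Group per day at the source so only one row per date is imported
        Aggregated = Table.Group(
            Sales,
            {"OrderDate"},
            {{"TotalAmount", each List.Sum([Amount]), type number}}
        )
    in
        Aggregated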

 

If the data size is still large after reduction, to improve refresh performance you can use the XMLA endpoint to bypass the refresh time limit in the Power BI Service. This feature is available in Premium capacities. Read Advanced incremental refresh and real-time data with the XMLA endpoint in Power BI - Power BI | Micr...

 

In addition, if you want to use incremental refresh, the data in the table should come from a single data source, so appending two queries from different data sources is not a good idea. (A sketch of an incremental-refresh filter on a single-source query follows below.)
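For reference, here is a minimal Power Query (M) sketch of the filter that incremental refresh expects on a single-source query. RangeStart and RangeEnd are the reserved datetime parameters Power BI uses for incremental refresh; the MySQL names and the OrderDate column are hypothetical.

    let
        // Hypothetical MySQL connection; incremental refresh partitions are built
        // from the RangeStart/RangeEnd filter applied directly to this one source
        Source = MySQL.Database("myserver", "mydb"),
        Sales = Source{[Schema = "mydb", Item = "sales"]}[Data],
        Filtered = Table.SelectRows(
            Sales,
            each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
        )
    in
        Filtered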

 

If you're interested, you can also read an older thread: What is the best way to handle 50GB dataset in Power BI?

 

Hope this helps. 

 

Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Highly appreciate your Kudos!

PowerBI_Mallow
Frequent Visitor

@DataVitalizer , thanks for the inputs.
As my table size is huge (60 GB), is it advisable to use a DAX function here? Also, I would like to know whether incremental refresh will work for appended queries in Power Query in my case.

Hi @PowerBI_Mallow 
For such a size you wouldn't need to import your table; instead you would connect to it using DirectQuery mode.
Coming back to you now with an updated scenario, let's say A = SharePoint, B = MySQL and C = the merged table:

  1. Append both tables into a new query in Power Query (see the sketch below)
  2. Apply incremental refresh on table A
  3. Disable loading for both tables A and B
  4. You will end up with a single table C in the Report view
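A minimal Power Query (M) sketch of step 1, assuming the two source queries are named SharePointHistory (table A) and MySQLRecent (table B) and share the same column layout. The query that carries the incremental-refresh filter keeps its RangeStart/RangeEnd step, and both source queries have Enable load turned off.

    let
        // Append the two staging queries into one table (table C);
        // the query names here are hypothetical placeholders
        Combined = Table.Combine({SharePointHistory, MySQLRecent})
    in
        Combined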


Did it work? 👍 A kudos would be appreciated. 📢 Mark it as a solution to help spread knowledge.

@DataVitalizer ,

We are not able to use DirectQuery due to its limitations, and we prefer Import mode.

What is the difference between appending table A (SharePoint) to the table (MySQL) versus using Append as New?

Also, in your suggested method, do we configure incremental refresh for table C? In that case, will the query be sent to both the SharePoint source and MySQL?

DataVitalizer
Solution Sage

Hi @PowerBI_Mallow 

Here is a quick answer: Power BI supports your need. You can import each table separately, but since you want to apply incremental refresh, I guess it would be better to combine them using DAX (the UNION function) rather than Power Query (a sketch follows below).
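A minimal DAX sketch of that UNION approach, assuming the two imported tables are named SharePointHistory and MySQLRecent (hypothetical names) and have the same number of columns in the same order, which UNION requires:

    // Calculated table combining the two imported tables;
    // the table names are placeholders for your actual queries
    Combined =
    UNION ( SharePointHistory, MySQLRecent )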

Did it work? 👍 A kudos would be appreciated. 📢 Mark it as a solution to help spread knowledge.
