Hi,
I am running into very slow refreshes and data transformations. Here is my situation:
- I have a large table (about 4 million rows covering the last 3 years) that needs to be appended to a dynamic table that grows each year. I see two options:
1. Export the old, large data to CSV, keep it on-premises, and have Power BI read that file on every refresh.
2. Keep this big table inside Power BI and append the new data to it.
Which approach is more efficient?
I appreciate your time
Alireza
For #2, you will need to refresh your data to append it anyway. You could use UNION and a DAX table to avoid the refresh, but I don't recommend that: it bloats your file size unnecessarily and is generally bad practice. Do you have the option to use incremental refresh? If so, I would do that. If not, I would store your CSV on SharePoint or OneDrive so you can refresh without needing a gateway.
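Roughly, incremental refresh works by filtering your source query on the RangeStart and RangeEnd Date/Time parameters; Power BI then only reloads the recent partitions and leaves the historical rows alone. A minimal sketch in Power Query (M), assuming a SQL source and a DateTime column named OrderDate (the server, database, table, and column names here are placeholders, so adjust them to your model):

```
// RangeStart and RangeEnd must be defined as Date/Time parameters in the model;
// Power BI substitutes the partition boundaries into them at refresh time.
let
    Source   = Sql.Database("your-server", "your-database"),
    Sales    = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only the rows that fall inside the current refresh window
    // (one boundary inclusive, the other exclusive, to avoid duplicates).
    Filtered = Table.SelectRows(
        Sales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

With that filter in place you can enable incremental refresh on the table, and the ~4 million historical rows stay in their partitions instead of being reloaded on every refresh.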
Regards,
Pat
To learn more about Power BI, follow me on Twitter or subscribe on YouTube.