Good Morning
I currently run a dashboard based on a daily Excel report that is sent from another team within the company. Each Excel file is around 9 MB, which on its own is insignificant, but now that we are hitting the 18-month mark, the daily refresh/import is taking more time than I would like. The old files never change; a new file is simply added to a folder/subfolder that Power BI looks at, and everything in it is pulled into the report.
Is it possible to keep the data already imported into the dashboard and simply add the latest data when I click refresh?
Having looked online, I see there are incremental refreshes, but from what I can tell these are based purely on date ranges within a single table, not on multiple files within a folder.
This may not be possible, but if it is, your help would be appreciated.
Thank you
Hi @Memphis28 ,
It is actually possible to do an incremental refresh with SharePoint files. The increment is based on the file-level date, not the dates within the file: https://www.youtube.com/watch?v=IVMdg16yBKE
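To illustrate the idea (this is a conceptual sketch in Python, not the actual Power Query M you would write): file-level incremental refresh selects which files to load based on each file's filesystem date, compared against the RangeStart/RangeEnd window that Power BI's incremental refresh uses. The folder and file names below are made up for the demo.

```python
# Conceptual sketch (NOT Power Query M): file-level incremental refresh
# picks files by their filesystem date, not by dates inside the data.
# RangeStart/RangeEnd mirror the parameters Power BI incremental refresh uses.
import os
import tempfile
from datetime import datetime, timezone

def files_in_range(folder, range_start, range_end):
    """Return file names whose modified time falls in [range_start, range_end)."""
    selected = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
        if range_start <= mtime < range_end:
            selected.append(name)
    return selected

# Demo: three "daily report" files with different file-level dates.
folder = tempfile.mkdtemp()
stamps = {
    "report_2024-01-15.xlsx": datetime(2024, 1, 15, tzinfo=timezone.utc),
    "report_2024-06-01.xlsx": datetime(2024, 6, 1, tzinfo=timezone.utc),
    "report_2024-06-20.xlsx": datetime(2024, 6, 20, tzinfo=timezone.utc),
}
for name, when in stamps.items():
    path = os.path.join(folder, name)
    open(path, "w").close()
    ts = when.timestamp()
    os.utime(path, (ts, ts))  # set the file's modified date for the demo

# Only files dated within the current refresh window get (re)loaded;
# the January file is left alone, just like your unchanged old files.
recent = files_in_range(folder,
                        datetime(2024, 6, 1, tzinfo=timezone.utc),
                        datetime(2024, 7, 1, tzinfo=timezone.utc))
print(recent)  # only the two June files
```

The point is that the filter happens before any file is opened, so the old 9 MB files are never re-read.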
Alternatively, and since those files will no longer change, I would:
- extract the output of the query over the historical files (e.g. with DAX Studio) to a CSV, or move that part of the query to a dataflow, and
- filter the existing folder query down to just the current files.
And then just combine both historical and current in Power BI Desktop.
Thank you, I think that video will solve my issues. I have watched it a couple of times and I think it makes sense. Hopefully I can put it into practice.
Just referencing your alternative resolution: what is the ETL, and what do you mean by connecting to the dataflow instead? Although I have a year of Power BI experience under my belt, there is a lot I don't know, as I've never needed to, so I will apologise in advance if this is basic knowledge I have missed.
Thanks again
The ETL is just the transformations you've applied to the files in the folder, say, for everything up to Feb this year. You can use DAX Studio to extract the output of that query as a CSV. Establish a connection to that file, then append it to the query of the current files. Your current-files query should only pick up files after Feb. This should be faster, as Power Query will need to apply further transformations only to a small number of files, or maybe just one file.
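The split described above can be sketched as follows (a conceptual Python sketch, not Power Query M; the file contents and the `transform` step are illustrative assumptions): the historical extract is loaded as-is with no transformation, the ETL runs only on the handful of current files, and the two results are appended.

```python
# Conceptual sketch (NOT Power Query M) of the historical/current split:
# load the already-transformed historical extract (one CSV) as-is, run the
# "expensive" transformation only on the few current files, then append.
import csv
import io

def transform(raw_rows):
    """Stand-in for the Power Query ETL applied to each new file (assumption)."""
    return [{"date": r["date"], "amount": float(r["amount"])} for r in raw_rows]

# Historical output exported once (e.g. from DAX Studio) -- read straight in,
# no per-file transformation needed on refresh.
historical_csv = "date,amount\n2024-01-05,10.0\n2024-02-10,20.0\n"
historical = [{"date": r["date"], "amount": float(r["amount"])}
              for r in csv.DictReader(io.StringIO(historical_csv))]

# Current files: only these go through the transformation on each refresh.
current_files = [
    "date,amount\n2024-06-01,30.5\n",
    "date,amount\n2024-06-02,40.0\n",
]
current = []
for content in current_files:
    current.extend(transform(list(csv.DictReader(io.StringIO(content)))))

# Append historical + current, like appending the two queries in Desktop.
combined = historical + current
print(len(combined))  # 4 rows total
```

Refresh time then scales with the number of current files, not the whole 18-month archive.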
Or you can move the query over the historical files to a dataflow. It is the same query, but filtered to just the historical data. The dataflow refreshes just once, and its output is stored in Azure. You can then connect to that dataflow in Desktop and combine it with the query of the current files. With a dataflow, you are only connecting to its output after it has applied its own transformations.