Hello,
We are testing Power BI to see what details it can provide about folder/subfolder/files.
So, pretty simply, we're pointing to a folder (in Get Data) and trying to get info about its contents. We do not need to interrogate the contents of the underlying files - just the attributes/metadata of each file: name, folder, extension, size, last time accessed, etc.
This is extremely slow. We have ~40k underlying files, and it took ~6 hours to run and load - and that was just one folder structure. Is there a setting or way to do this so it runs quicker?
Thanks for any help you can provide,
Dan
Use PowerShell; this isn't really a use case Power BI was designed for. I've written multi-threaded PowerShell to scan folder structures, and it's a much better tool for the job. Use PowerShell to generate a CSV file of the metadata and import that into Power BI. You are trying to use a miter saw to hammer a nail.
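As a rough sketch (the folder path and output CSV path below are just placeholders - adjust them for your environment), something like this will dump the metadata for every file in a folder tree:

# Recursively list every file under the root folder (placeholder path)
Get-ChildItem -Path "C:\Data\RootFolder" -Recurse -File |
    # Keep only the metadata columns of interest
    Select-Object Name, DirectoryName, Extension, Length, CreationTime, LastAccessTime, LastWriteTime |
    # Write the results to a CSV that Power BI can import (placeholder path)
    Export-Csv -Path "C:\Data\file_metadata.csv" -NoTypeInformation

Then point Power BI's CSV connector at the output file instead of crawling the folder directly - the refresh becomes a single flat-file load.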
@Greg_Deckler Do you mean using PowerShell to scan Excel files and convert them into metadata CSV files?
I have the exact same issue: huge Excel files as data sources, updated monthly, and Power BI refreshes all the data sources from scratch, which takes forever.