Good afternoon,
Please help me with some advice.
There is a report in Power BI whose source is data from Excel files stored in SharePoint.
There are more than 500 files, and every day a new one arrives with a new date in its name.
The client isn't able to store the data in a database, but the SharePoint option isn't ideal for them either, because human error means files can be deleted or edited.
They want to use Fabric's capabilities to transform these files and store them in a single table in a Lakehouse.
Ideally, they would first load all the Excel files into a table, delete them from SharePoint, and then create a dataflow that appends only the single daily file's data to the table.
The question is: is this possible, and how can it be implemented in Fabric?
Any advice would be greatly appreciated.
Hi @Anonymous
Microsoft Fabric is certainly capable of handling a large volume of files and data.
I'd recommend that you go through the lakehouse and warehousing tutorials:
| Tutorial name | Scenario |
| --- | --- |
| Lakehouse | In this tutorial, you ingest, transform, and load the data of a fictional retail company, Wide World Importers, into the lakehouse and analyze sales data across various dimensions. |
| Data warehouse | In this tutorial, you build an end-to-end data warehouse for the fictional Wide World Importers company. You ingest data into the data warehouse, transform it using T-SQL and pipelines, run queries, and build reports. |
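For the one-time initial load described in the question, a Fabric notebook is one option alongside the tutorial approaches. Below is a minimal sketch, assuming the Excel files have already been copied from SharePoint into the Lakehouse Files area and all share the same column layout; the folder path and the table name `daily_data` are placeholders for illustration, not details from this thread.

```python
# One-time bulk load: combine all archived Excel files into a single Delta table.
# Assumes the files were already copied into the Lakehouse Files area and share
# the same columns. Folder path and table name are placeholder assumptions.
import glob
import pandas as pd

frames = []
for path in glob.glob("/lakehouse/default/Files/excel_archive/*.xlsx"):
    df = pd.read_excel(path)                     # uses the openpyxl engine
    df["source_file"] = path.rsplit("/", 1)[-1]  # keep per-file lineage
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)

# "spark" is predefined in a Fabric notebook session; saveAsTable creates a
# managed Delta table in the attached Lakehouse.
spark.createDataFrame(combined).write.mode("overwrite").saveAsTable("daily_data")
```

Once this runs, the table is a managed Delta table in the Lakehouse, so it can be queried from the SQL analytics endpoint or used directly as a Power BI source.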
Hi @Anonymous
Why don't you first copy the files from SharePoint into the Lakehouse? Then, once they are in the Lakehouse, you can use a Dataflow Gen2 (which is essentially Power Query) to append the data into an existing Lakehouse table.
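If a notebook is preferred over Dataflow Gen2 for the daily step, the append could look something like this; a minimal sketch, assuming the new file lands in a Lakehouse folder with the date in its name and that it feeds the same `daily_data` table as the bulk-load sketch above (the naming pattern, folder, and table name are assumptions):

```python
# Daily append: a notebook alternative to the Dataflow Gen2 approach.
# Assumes today's file arrives as e.g. report_2026-03-16.xlsx; the pattern,
# folder, and table name are placeholder assumptions for illustration.
from datetime import date
import pandas as pd

today_file = f"/lakehouse/default/Files/incoming/report_{date.today():%Y-%m-%d}.xlsx"

df = pd.read_excel(today_file)
df["source_file"] = today_file.rsplit("/", 1)[-1]  # keep per-file lineage

# mode("append") adds only the new day's rows to the existing Delta table;
# "spark" is predefined in a Fabric notebook session.
spark.createDataFrame(df).write.mode("append").saveAsTable("daily_data")
```

Scheduling this notebook (or the Dataflow Gen2) once a day would cover the "only one daily file" requirement from the question.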