My company is migrating files from one system to another. One of the requirements for migration is that every machine be scanned for malware. Each of these scans produces a log file. My job is to parse the log files and present the aggregate data.
I've written an M script that parses the files into three tables: runTable, runErrorsTable, and runDetectionsTable. The latter two tables are many-to-one with runTable.
I've created a report that does what the boss needs, but I'm concerned about scalability. There are thousands of these log files, with more created daily, each anywhere from 10 KB to ~2 MB.
Since parsing these files is not trivial (they have multiple sections in various formats), does it make more sense to pre-parse them into a set of Dataverse tables and let Power BI use those as a data source? Or should I just let Power BI handle both parsing and presentation?
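
For context, here's a stripped-down Python sketch of what I mean by pre-parsing outside Power BI (this is not my actual M logic; the section markers, field names, and paths are made up for illustration):

```python
# Hypothetical pre-parsing sketch: each scan log feeds three tables.
# "[RUN]", "[ERROR]", "[DETECTION]" and the field layouts are placeholders
# for the real multi-section log format. Requires pandas + pyarrow.
from pathlib import Path
import pandas as pd

def parse_log(path: Path):
    run, errors, detections = {}, [], []
    run_id = path.stem  # file name doubles as the many-to-one join key
    for line in path.read_text(errors="replace").splitlines():
        if line.startswith("[RUN]"):
            # e.g. "[RUN] machine=HOST01 started=2025-11-03T10:00:00"
            run = dict(kv.split("=", 1) for kv in line[5:].split())
            run["run_id"] = run_id
        elif line.startswith("[ERROR]"):
            errors.append({"run_id": run_id, "message": line[7:].strip()})
        elif line.startswith("[DETECTION]"):
            detections.append({"run_id": run_id, "detail": line[11:].strip()})
    return run, errors, detections

runs, errs, dets = [], [], []
for f in Path("logs").glob("*.log"):
    r, e, d = parse_log(f)
    runs.append(r); errs.extend(e); dets.extend(d)

# One Parquet file per table; Power BI (or a lakehouse) reads these directly
# instead of re-parsing thousands of raw text files on every refresh.
pd.DataFrame(runs).to_parquet("runTable.parquet")
pd.DataFrame(errs).to_parquet("runErrorsTable.parquet")
pd.DataFrame(dets).to_parquet("runDetectionsTable.parquet")
```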
Can you convert the files into Parquet / Delta Lake format?
Delta Lake table optimization and V-Order - Microsoft Fabric | Microsoft Learn
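
If the pre-parsed output lands in a Fabric lakehouse, a minimal PySpark sketch like the one below appends each day's batch to a Delta table. The staging path, table name, and schema are assumptions for illustration, not anything from your setup; the linked article covers how Fabric applies V-Order optimization to Delta writes.

```python
# Sketch: append a parsed batch to a Delta table in a Fabric lakehouse.
# "Files/staging/..." and the table name are assumed for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

detections = spark.read.parquet("Files/staging/runDetectionsTable.parquet")

(detections.write
    .format("delta")
    .mode("append")            # daily batches accumulate incrementally
    .saveAsTable("runDetectionsTable"))
```

Querying a Delta table (via Direct Lake or the SQL endpoint) tends to scale much better than having Power Query re-parse thousands of small text files on every refresh, and new daily files only cost you one incremental append.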