Hi,
Is there any way to get the list of parquet files that are being used for the current Delta table version loaded in a notebook? I have a Delta table that is overwritten daily, so I have one parquet file per day. I guess the latest version is only using the last parquet file because I overwrite the full table. How can I programmatically find the name of that parquet file?
I've found the solution:
df.inputFiles()
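For anyone else landing here, a minimal sketch of how this can be used in a Fabric notebook; the table name my_table is a placeholder:

# read the current snapshot of the Delta table from the attached lakehouse
df = spark.read.table("my_table")

# inputFiles() returns the parquet files backing this DataFrame's snapshot
for path in df.inputFiles():
    print(path)

For a daily-overwritten table this should print a single part file per partition of the latest version.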
Hi @amaaiia ,
Thanks for using Fabric Community.
In order to provide some suggestions, I would like to know why you are trying to find the parquet files. Is there anything specific you are trying to do?
If you are looking for optimization - Delta Lake table optimization and V-Order - Microsoft Fabric | Microsoft Learn
If you are looking for versions of the table, you can get the version history (see the sketch after the links below).
Docs to refer -
Data versioning using Time Travel feature in Delta Lake | by Prachi Kushwah | Medium
Work with Delta Lake table history | Databricks on AWS
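As a hedged illustration (the table name my_table is a placeholder), the history can be inspected from a notebook like this:

from delta.tables import DeltaTable

# load the Delta table by name and show its commit history,
# one row per version with the operation that produced it
dt = DeltaTable.forName(spark, "my_table")
dt.history().select("version", "timestamp", "operation").show(truncate=False)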
Hope this is helpful. Please let me know your use case if possible, and I will try to guide you better.
I'm looking for the parquet files because I'm having issues when reading the same table from the SQL endpoint and from a notebook: each one shows different data. So, for testing purposes, I want to check which parquet files are being read from the notebook.
Hi @amaaiia ,
Docs - Microsoft Spark Utilities (MSSparkUtils) for Fabric - Microsoft Fabric | Microsoft Learn
From the Lakehouse, you can list the files under the table's folder, for example:
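A sketch using MSSparkUtils, assuming a lakehouse is attached and the table is named my_table (mssparkutils is available by default in Fabric notebooks):

# list the parquet files (and the _delta_log folder) under the Delta table's folder
for f in mssparkutils.fs.ls("Tables/my_table"):
    print(f.name, f.size)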
In order to load a parquet file directly, for example:
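A sketch along the same lines; the file name below is a hypothetical placeholder for one of the paths listed above:

# read a single parquet file from the table's folder by its relative path
df = spark.read.parquet("Tables/my_table/part-00000-example.snappy.parquet")
df.show()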
Can you check whether this works for you?