Hi,
I'm a complete newbie to the cloud, coming from an on-premises MS stack, and I'm trying to put together an end-to-end PoC.
I have created a lakehouse. So far I have dumped data from an external system to CSV files locally, and the plan is to use OneLake file explorer to sync those files to the cloud. So now I have a folder with multiple CSV files in my lakehouse.
I want to use a notebook to read those files and dump them into parquet. For the life of me I cannot find anything on how to loop over the folder; I tried os and glob, but I don't know what path to pass in.
Not sure this is the right approach, but the idea is that each day I create a new folder of staging CSV files, somehow convert it to parquet files, and compare against the previous day's data to work out what's new and what's modified. Then I'd use dbt to transform the data and finally load it into the data warehouse.
So, back to the original question: what path do I pass in to loop over the folder in the notebook?
Thanks
@jonjoseph if I understood correctly, your daily CSV files sit in a folder under Files in the lakehouse.
Grab the ABFS path of that folder and use it in the notebook:
// Replace this with your actual folder path
val files = "abfss://workspace@onelake.dfs.fabric.microsoft.com/testLH2.Lakehouse/Files/DailyFiles"

// Read every CSV file in the folder into one DataFrame, keeping the source
// file name and modification time from Spark's hidden _metadata column
val df = spark.read.option("header", "true").csv(files)
  .select("*", "_metadata.file_name", "_metadata.file_modification_time")

display(df)
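The question also asks about landing the data as parquet, which is just a write on the same DataFrame. Here is a minimal PySpark sketch of the full read-and-write, assuming the same DailyFiles folder as above and a hypothetical Files/StagingParquet output folder in the same lakehouse:

# Assumption: same workspace/lakehouse as the example above;
# Files/StagingParquet is a hypothetical output folder.
src = "abfss://workspace@onelake.dfs.fabric.microsoft.com/testLH2.Lakehouse/Files/DailyFiles"
dst = "abfss://workspace@onelake.dfs.fabric.microsoft.com/testLH2.Lakehouse/Files/StagingParquet"

# Read every CSV in the folder, keeping the per-file metadata columns
df = (spark.read.option("header", "true").csv(src)
      .select("*", "_metadata.file_name", "_metadata.file_modification_time"))

# Write the combined staging data out as parquet
df.write.mode("overwrite").parquet(dst)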
Copying the File API path and using it with os.listdir worked!
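For reference, the File API path approach boils down to listing the mounted folder with plain Python. A minimal sketch, assuming the lakehouse is attached as the notebook's default and the CSVs sit in Files/DailyFiles:

import os

# File API path: the local mount of the default lakehouse in the notebook
folder = "/lakehouse/default/Files/DailyFiles"

# Loop over the CSV files in the folder; each full path can then be read
# with pandas or Spark and written out as parquet
for name in os.listdir(folder):
    if name.endswith(".csv"):
        print(os.path.join(folder, name))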
Hi @jonjoseph ,
Glad to know your issue got resolved. Please continue to use the Fabric Community for any further queries.