I would like to use a Notebook to run the same script on every file stored in the Lakehouse Files area. How can this be done? If a storage account key is needed to access the lakehouse files, where can it be obtained?
Thanks! I'm sure this works, and it almost worked for me, but for some reason I keep getting this error: Spark_Ambiguous_MsSparkUtils_UseMountedPathFailure.
I'll keep checking.
For me it only works when I use the /lakehouse/default/Files... path.
However, when I try to use it with the abfss path, I get the following error:
FileNotFoundError: [Errno 2] No such file or directory: 'abfss://.../input'
import os

file_path = "abfss://.../input"
lst = os.listdir(file_path)
lst
Any idea what causes that issue?
UPDATE: It works now, thanks again! There were a couple of problems: the delta tables I was creating had blank spaces in their names, and when creating the delta tables I switched to using the fully qualified path instead of the relative path.