The new Parquet connector is fine, but a common scenario is producing the data files with Scala/Spark/Databricks etc. in production. In every case I'm familiar with, files are written per partition with GUID names.
In my specific case, because this is cooked data for dashboards, I can keep it to a single partition in ADLS Gen2 blob storage. However, the GUID changes with each day's data, so subsequent refreshes 404.
How can I have Power BI read a directory of Parquet files, or supply wildcards in the path?
Here is the query I use:
let
    Source = AzureStorage.Blobs("https://mystorageaccount.blob.core.windows.net"),
    AuditLogs = Source{[Name="logsfolder"]}[Data],
    LogContent = Table.Column(AuditLogs, "Content"),
    CombineFiles = List.Transform(LogContent, (row) => Parquet.Document(row)),
    LogTable = Table.Combine(CombineFiles)
in
    LogTable
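Since the GUID file names change daily, one option is to not reference any file name at all and instead filter the folder listing by extension before combining. A minimal sketch, reusing the storage account and folder names from the query above (adjust to your own):

```
let
    Source = AzureStorage.Blobs("https://mystorageaccount.blob.core.windows.net"),
    Folder = Source{[Name="logsfolder"]}[Data],
    // Keep only .parquet blobs, whatever their GUID names happen to be today
    ParquetOnly = Table.SelectRows(Folder, each Text.EndsWith([Name], ".parquet")),
    // Parse each blob's binary Content and stack the resulting tables
    Combined = Table.Combine(List.Transform(ParquetOnly[Content], Parquet.Document))
in
    Combined
```

Because the query selects rows by extension rather than by name, a new GUID each day should not cause a 404 on refresh, as long as all files in the folder share the same schema.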
Hi,
I retrieve a Parquet file from a URL. If you need to do the same, you can follow these instructions:
Click on Transform Data.
Create a Blank Query.
Enter the following code in the query editor:

let
    // URL of the Parquet file
    Source = Binary.Buffer(Web.Contents("YOUR URL")),
    // Load the Parquet file into a table
    Table = Parquet.Document(Source)
in
    Table
Replace "YOUR URL" with the actual URL of your Parquet file.
Thank you.
Did you find a solution?
I'm guessing you're referring to the ADLS Gen2 File Share connector? Or is there another connector I'm not seeing? The local 'Folder' option only seems to allow local directories.
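For what it's worth, the ADLS Gen2 connector in M is AzureStorage.DataLake, which lists files under a dfs endpoint path much like AzureStorage.Blobs does for a blob endpoint. A sketch, assuming the same hypothetical account and container names as above:

```
let
    // List all files under the given ADLS Gen2 path (dfs endpoint, not blob)
    Files = AzureStorage.DataLake("https://mystorageaccount.dfs.core.windows.net/logsfolder"),
    // Same pattern as before: filter by extension, then parse and combine
    ParquetOnly = Table.SelectRows(Files, each Text.EndsWith([Name], ".parquet")),
    Combined = Table.Combine(List.Transform(ParquetOnly[Content], Parquet.Document))
in
    Combined
```

This avoids the local 'Folder' connector entirely, so no file names or wildcards need to appear in the path.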