I have a lot of files in a certain SharePoint (SP) location, and the number keeps increasing every fortnight. I need to create ETL logic to ingest them into a Power BI dataflow on the PBI service. You can assume they all contain the same kind of data, so appending them together is fine.
To do this, I first ingested a single file using a Web API connection to the SP file and applied Power Query transformations on top of it. Then, in the Advanced Editor, I replaced the file path with a parameter. After that I converted the query into a function, called XYZ, so I can invoke it for all the files in the SP folder.
Then I created a new connection using the SharePoint Folder connector. This gave me the list of all the files and their paths. I removed the irrelevant files, and for the relevant ones I created a custom column that concatenates the folder path and file name. I then created an additional custom column that calls my function XYZ on each file path. Everything worked perfectly, but I'm just not able to save my dataflow.
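For context, the approach described above looks roughly like this (names such as XYZ are the ones I used; the body is a sketch). Calling Web.Contents with a path computed at refresh time is what makes the source "dynamic":

```powerquery
// Hypothetical sketch of the function described above.
// XYZ receives a file URL that is built per row at refresh time;
// Power BI cannot statically resolve such a source, which is what
// triggers the "dynamic data source" error when saving a dataflow.
let
    XYZ = (FilePath as text) as table =>
        let
            Source = Csv.Document(Web.Contents(FilePath)),
            Promoted = Table.PromoteHeaders(Source)
        in
            Promoted
in
    XYZ
```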
The error I'm getting is: "One or more tables references a dynamic data source."
ChatGPT suggested changing SharePoint.Files() to SharePoint.Contents() in my Advanced Editor, but then I get an authentication error: invalid credentials.
What can be done about it?
You need to follow Power BI's "combine binaries" pattern — similar to when you click “Combine Files” after connecting to a folder.
Here’s the clean way:
Use SharePoint.Files() (Dataflow-safe). Connect to the SharePoint site (not a single file):
let
Source = SharePoint.Files("https://yourtenant.sharepoint.com/sites/yoursite", [ApiVersion = 15])
in
Source
Filter for the correct folder and file types (e.g. .csv, .xlsx, etc.):
Filtered = Table.SelectRows(Source, each Text.StartsWith([Folder Path], "https://yourtenant.sharepoint.com/sites/yoursite/Shared Documents/YourFolder/")
and Text.EndsWith([Name], ".csv"))
Use a single transformation function across all binaries:
Transform the Content column in place: TransformFile = Table.TransformColumns(Filtered, {"Content", each MyTransformFunction(_)})
OR use the Combine Files GUI (it auto-creates a "Transform File" helper query behind the scenes — reuse it!)
Don’t build dynamic file paths manually
Let Power BI handle the content by passing [Content] from the SharePoint.Files table.
Example of a Valid Transform Function (static):
let
    // Parse each binary as a comma-delimited CSV; adjust Columns (here 5)
    // and Encoding (65001 = UTF-8) to match your files
    MyTransformFunction = (file as binary) =>
        let
            Source = Csv.Document(file, [Delimiter = ",", Columns = 5, Encoding = 65001, QuoteStyle = QuoteStyle.None]),
            Promoted = Table.PromoteHeaders(Source, [IgnoreErrors = true])
        in
            Promoted
in
    MyTransformFunction
Then use this function like:
Transformed = Table.AddColumn(Filtered, "Data", each MyTransformFunction([Content]))
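Putting the pieces together, the whole dataflow query can look like the sketch below. The site URL, folder path, and column names are placeholders carried over from the snippets above; replace them with your own:

```powerquery
let
    // 1. Static connection to the SharePoint site root (not a single file)
    Source = SharePoint.Files("https://yourtenant.sharepoint.com/sites/yoursite", [ApiVersion = 15]),

    // 2. Keep only the CSVs in the target folder
    Filtered = Table.SelectRows(Source, each
        Text.StartsWith([Folder Path], "https://yourtenant.sharepoint.com/sites/yoursite/Shared Documents/YourFolder/")
        and Text.EndsWith([Name], ".csv")),

    // 3. Apply the static transform function to each file's binary content
    Transformed = Table.AddColumn(Filtered, "Data", each MyTransformFunction([Content])),

    // 4. Expand the nested tables into one appended table;
    //    replace these names with your actual column headers
    Expanded = Table.ExpandTableColumn(Transformed, "Data",
        {"Column1", "Column2", "Column3", "Column4", "Column5"})
in
    Expanded
```

Because every data-source call here takes a literal site URL, the dataflow can resolve the source statically and the "dynamic data source" error goes away.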
About SharePoint.Contents(): SharePoint.Contents() is a more granular function that sometimes resolves dynamic-path issues, but only for certain auth modes. If you got an "Invalid Credentials" error:
Try using Organizational Account when prompted
Ensure your tenant allows OAuth flows via SharePoint.Contents (some don't)
You may have to switch back to SharePoint.Files() and apply the above technique instead
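If you do try SharePoint.Contents(), keep the connection static by drilling into the table it returns rather than building a URL string. A minimal sketch, assuming a "YourFolder" folder under the default document library (names hypothetical):

```powerquery
let
    // Connect once at the site root; each nested folder is a row
    // whose [Content] you drill into — no computed URLs involved
    Source = SharePoint.Contents("https://yourtenant.sharepoint.com/sites/yoursite"),
    Docs = Source{[Name = "Shared Documents"]}[Content],
    Folder = Docs{[Name = "YourFolder"]}[Content]
in
    Folder
```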
Did I answer your question? Mark my post as a solution! Appreciate your Kudos !!
Hi @SwaTHasSasIN ,
As we haven’t heard back from you, we wanted to kindly follow up and check whether the provided solution resolved the issue. Let us know if you need any further assistance.
If our response addressed your issue, please mark it as the accepted solution and give Kudos if you found it helpful.
Regards,
Chaithra.