So I have a lot of files in a certain SharePoint location, and the number keeps increasing every fortnight. I need to create ETL logic to ingest them into a Power BI dataflow on the Power BI service. You can assume they all have the same kind of data, so appending them together is fine.
For this, I first ingested a single file using a Web API connection to the SharePoint file and did Power Query transformations on top of it. Then, in the Advanced Editor, I replaced the file path with a parameter. After that I converted the query to a function, called XYZ, so I can invoke it for all the files in the SharePoint folder.
Then I created a new connection to the SharePoint folder. This gave me the list of all the files and their paths. I removed the irrelevant files, and for the relevant ones I created a custom column that concatenates the folder path and file name. I then added another custom column that invokes my function XYZ on each file path. Everything worked perfectly, but I'm just not able to save my dataflow.
The error I'm getting is: One of more tables references a dynamic data source.
Chatgpt asked me to change SharePoint.SharePoint.Files() to SharePoint.Contents() in my advanced editor, but then I'm getting authentication error: invalid credentials.
What can be done about it?
You need to follow Power BI's "combine binaries" pattern, similar to what happens when you click "Combine Files" after connecting to a folder.
Here’s the clean way:
Option 1: SharePoint.Files() (dataflow-safe)
Connect to the SharePoint folder (not a single file):
let
Source = SharePoint.Files("https://yourtenant.sharepoint.com/sites/yoursite", [ApiVersion = 15])
in
Source
Filter for the correct folder and file types (e.g. .csv, .xlsx, etc.):
Filtered = Table.SelectRows(Source, each Text.StartsWith([Folder Path], "https://yourtenant.sharepoint.com/sites/yoursite/Shared Documents/YourFolder/")
and Text.EndsWith([Name], ".csv"))
Use a single transformation function across all binaries. Either transform the Content column in place:
TransformFile = Table.TransformColumns(Filtered, {"Content", each MyTransformFunction(_)})
or use the Combine Files GUI (it auto-creates a helper query behind the scenes; reuse it).
Don't build dynamic file paths manually. Let Power BI handle the content by passing [Content] from the SharePoint.Files table.
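To make the contrast concrete, here is a sketch of the anti-pattern versus the dataflow-safe pattern (the function names XYZ and MyTransformFunction stand in for your own helpers):

```
// Anti-pattern: build a URL per file and fetch it inside the function.
// The service cannot statically resolve the source, hence the
// "dynamic data source" error.
BadPattern = Table.AddColumn(Filtered, "Data",
    each XYZ([Folder Path] & [Name]))   // XYZ fetches the path internally

// Safe pattern: one static SharePoint.Files source; the function only
// parses the binary it is handed.
GoodPattern = Table.AddColumn(Filtered, "Data",
    each MyTransformFunction([Content]))
```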
Example of a Valid Transform Function (static):
let
MyTransformFunction = (file as binary) =>
let
Source = Csv.Document(file, [Delimiter=",", Columns=5, Encoding=65001, QuoteStyle=QuoteStyle.None]),
Promoted = Table.PromoteHeaders(Source, [IgnoreErrors=true])
in
Promoted
in
MyTransformFunction
Then use this function like:
Transformed = Table.AddColumn(Filtered, "Data", each MyTransformFunction([Content]))
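Putting the steps above together, a complete dataflow query might look like this sketch (tenant, site, and folder names are placeholders; Table.Combine assumes every file parses to the same columns):

```
let
    // One static source: the whole SharePoint site
    Source = SharePoint.Files("https://yourtenant.sharepoint.com/sites/yoursite", [ApiVersion = 15]),
    // Keep only the relevant folder and file type
    Filtered = Table.SelectRows(Source, each
        Text.StartsWith([Folder Path], "https://yourtenant.sharepoint.com/sites/yoursite/Shared Documents/YourFolder/")
        and Text.EndsWith([Name], ".csv")),
    // Parse each binary with the static transform function
    Transformed = Table.AddColumn(Filtered, "Data", each MyTransformFunction([Content])),
    // Append all parsed tables into one
    Combined = Table.Combine(Transformed[Data])
in
    Combined
```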
Option 2: SharePoint.Contents()
SharePoint.Contents() is a more granular function that sometimes solves dynamic path issues, but only for certain auth modes. If you got an "Invalid Credentials" error:
Try using Organizational Account when prompted.
Ensure your tenant allows OAuth flows via SharePoint.Contents (some don't).
You may have to switch back to SharePoint.Files() and apply the above technique instead.
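For reference, SharePoint.Contents() is navigated by drilling into folders by name rather than filtering a flat file list. A sketch, assuming the Organizational Account credential works (library and folder names are placeholders):

```
let
    Source = SharePoint.Contents("https://yourtenant.sharepoint.com/sites/yoursite", [ApiVersion = 15]),
    // Drill into the document library, then the target folder, by name
    Docs = Source{[Name = "Shared Documents"]}[Content],
    Folder = Docs{[Name = "YourFolder"]}[Content],
    // Keep only the CSV files in that folder
    CsvFiles = Table.SelectRows(Folder, each Text.EndsWith([Name], ".csv"))
in
    CsvFiles
```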
Hi @SwaTHasSasIN ,
As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided resolved the issue. Let us know if you need any further assistance.
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.
Regards,
Chaithra.