I’m getting a credential error in my Power BI Dataflow after adding custom functions that reference a SharePoint folder.
The credentials provided for the SharePoint source are invalid.
I already verified:
• The SharePoint site URL is correct
• I have access to the SharePoint library
• The same SharePoint connection works in other queries
• Parameter and sample file references are correct
But once I wire up my parameter, sample file, and transform file functions and invoke them against the table of files, the dataflow fails with the credential error above.
Here is the structure I’m using:
- ISF Reports New
- SharePoint file path
- Reads sample workbook
- Custom function to transform each file
- Invokes the function on all files in the folder
Has anyone faced this issue before?
Is this caused by:
• Using Binary parameter for SharePoint?
• A function referencing a different authentication scope?
• A mismatch between the Dataflow’s linked connection and the function source?
What is the correct way to configure SharePoint credentials so all custom functions authenticate properly?
Hi, @dragonhood
The "Invalid Credentials" error in Dataflows often happens because the Power Query engine in the Service struggles to validate credentials inside a separate "Custom Function" object.
The most robust way to fix this is to move the transformation logic directly into your main query (the "Inline" approach). This bypasses the need for the Invoke Custom Function step entirely.
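For context, the error usually shows up with the auto-generated combine-files setup, where the main query hands each file's binary to a separate function query. A rough sketch of that pattern, with the site URL, folder path, and the helper name #"Transform File (9)" as placeholders:

let
    // Main folder query produced by the combine-files experience
    Source = SharePoint.Files("https://yourdomain.sharepoint.com/sites/YourSite", [ApiVersion = 15]),
    FilteredRows = Table.SelectRows(Source, each [Folder Path] = "Shared Documents/YourFolder/"),
    // Each binary is passed to a separate custom function query, which the
    // Service may treat as its own data source when validating credentials
    Invoked = Table.AddColumn(FilteredRows, "Transform File", each #"Transform File (9)"([Content]))
in
    Invoked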
Here is the solution based on your data structure:
Step 1: Create your parameters
In your Dataflow, create two text parameters to handle the connection dynamically:
SharepointSite
o Type: Text
o Current Value: https://yourdomain.sharepoint.com/sites/YourSite
o (Enter the root URL of your SharePoint site here)
FolderPath
o Type: Text
o Current Value: Shared Documents/YourFolder/
o (Note: Copy the exact value from the "Folder Path" column in the preview to ensure it matches)
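If it helps to see them in code form, a text parameter is just a small query of its own. A minimal sketch of the two parameters, assuming the names SharepointSite and FolderPath that the script in Step 2 expects (values are placeholders):

// Parameter query "SharepointSite" (root URL of the site)
"https://yourdomain.sharepoint.com/sites/YourSite"
    meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]

// Parameter query "FolderPath" (copy the exact value shown in the "Folder Path" column)
"Shared Documents/YourFolder/"
    meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]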
Step 2: Use this M code
Create a blank query and paste the code below. This script connects, filters the specific folder using the parameters above, extracts the "Charts" sheet from the Excel files, and performs the cleanup/splitting logic without calling external functions.
let
// 1. Load SharePoint files using the parameter
Source = SharePoint.Files(SharepointSite, [ApiVersion = 15]),
// 2. Keep only the required folder
FilteredRows = Table.SelectRows(Source, each [Folder Path] = FolderPath),
// 3. Exclude hidden files
VisibleFiles = Table.SelectRows(FilteredRows, each [Attributes]?[Hidden]? <> true),
// 4. Read Excel content directly (Inline transformation)
ExcelContent = Table.AddColumn(VisibleFiles, "Excel", each Excel.Workbook([Content], null, true)),
// 5. Extract specific sheet "Charts"
ExtractCharts = Table.AddColumn(
ExcelContent,
"ChartsSheet",
each Table.SelectRows([Excel], each [Item] = "Charts" and [Kind] = "Sheet"){0}[Data]
),
// 6. Promote headers inside each nested table
PromoteHeaders = Table.TransformColumns(
ExtractCharts,
{"ChartsSheet", each Table.PromoteHeaders(_, [PromoteAllScalars = true])}
),
// 7. Rename file column to keep source traceability
RenameSource = Table.RenameColumns(PromoteHeaders, {{"Name", "SourceName"}}),
// 8. Expand data from the nested tables
ExpandedData = Table.ExpandTableColumn(
RenameSource,
"ChartsSheet",
Table.ColumnNames(PromoteHeaders{0}[ChartsSheet])
),
// 9. Set Standard types
ChangeType = Table.TransformColumnTypes(
ExpandedData,
{
{"SourceName", type text},
{"Period", type date},
{"Total ISFs Filed", Int64.Type},
{"Total On Time ISFs", Int64.Type},
{"Total Late ISFs", Int64.Type},
{"Percentage On Time", Percentage.Type},
{"Percentage Late", Percentage.Type}
}
),
// 10. Remove unneeded columns
RemoveExtraCols = Table.RemoveColumns(ChangeType, {"Column7", "Period_id"}),
// 11. Filter only valid rows
FilterValid = Table.SelectRows(
RemoveExtraCols,
each ([Percentage On Time] <> null and [Percentage Late] <> null)
),
// 12. Duplicate SourceName for split logic
DuplicateSource = Table.DuplicateColumn(FilterValid, "SourceName", "SourceName_Copy1"),
// 13. Split importer info (Transition from Text to Number)
SplitImporter = Table.SplitColumn(
DuplicateSource,
"SourceName_Copy1",
Splitter.SplitTextByCharacterTransition({"c"}, each not List.Contains({"0" .. "9"}, _)),
{"Importer", "ImporterCodePart1", "ImporterCodePart2"}
),
// 14. Clean Importer Name
CleanImporter = Table.ReplaceValue(
SplitImporter,
"ISF REPORT - ",
"",
Replacer.ReplaceText,
{"Importer"}
),
// 15. Merge importer code parts
MergeImporterCode = Table.CombineColumns(
CleanImporter,
{"ImporterCodePart1", "ImporterCodePart2"},
Combiner.CombineTextByDelimiter("", QuoteStyle.None),
"ImporterCode"
),
// 16. Clean file extension from code
CleanImporterCode = Table.ReplaceValue(
MergeImporterCode,
".xlsx",
"",
Replacer.ReplaceText,
{"ImporterCode"}
),
// 17. Reorder final columns
FinalReorder = Table.ReorderColumns(
CleanImporterCode,
{
"SourceName",
"Importer",
"ImporterCode",
"Period",
"Total ISFs Filed",
"Total On Time ISFs",
"Total Late ISFs",
"Percentage On Time",
"Percentage Late"
}
)
in
FinalReorder

This approach is much more stable in Dataflows because all lineage is contained within a single query context.
Let me know if this works for you!
Hi @rodrigosan ,
Thanks for your reply. I'm currently facing this error after saving this query
Could you please guide me on this?
Hi @dragonhood,
That specific error (There weren't enough elements...) happens because the code is trying to access the first row {0} of a table that is empty.
Why? It means that at least one of the Excel files inside your SharePoint folder does not contain a sheet named exactly "Charts". When the code tries to grab that sheet and doesn't find it, it crashes.
To fix this, we need to modify the Step 5 (ExtractCharts) to handle errors gracefully (using try...otherwise null) and add a filter immediately after to exclude files that don't have the required sheet.
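As a small, self-contained illustration of the difference (the empty #table below just stands in for a workbook that has no "Charts" sheet):

let
    // Sheets table as Excel.Workbook would return it, but without a "Charts" sheet
    Sheets = #table({"Item", "Kind", "Data"}, {}),
    // Positional access {0} on the empty selection would raise the enumeration error:
    // Unsafe = Table.SelectRows(Sheets, each [Item] = "Charts"){0}[Data],
    // Wrapping the access in try ... otherwise returns null instead,
    // so the offending file can be filtered out afterwards
    Safe = try Table.SelectRows(Sheets, each [Item] = "Charts"){0}[Data] otherwise null
in
    Safe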
Here is the updated code block. Please replace your entire query with this version:
let
// 1. Load SharePoint files using the parameter
Source = SharePoint.Files(SharepointSite, [ApiVersion = 15]),
// 2. Keep only the required folder
FilteredRows = Table.SelectRows(Source, each [Folder Path] = FolderPath),
// 3. Exclude hidden files
VisibleFiles = Table.SelectRows(FilteredRows, each [Attributes]?[Hidden]? <> true),
// 4. Read Excel content directly (Inline transformation)
ExcelContent = Table.AddColumn(VisibleFiles, "Excel", each Excel.Workbook([Content], null, true)),
// 5. Extract specific sheet "Charts" WITH ERROR HANDLING
// If the sheet "Charts" doesn't exist, it returns null instead of crashing
ExtractCharts = Table.AddColumn(
ExcelContent,
"ChartsSheet",
each try Table.SelectRows([Excel], each [Item] = "Charts" and [Kind] = "Sheet"){0}[Data] otherwise null
),
// 5.1 NEW STEP: Filter out files that didn't have the "Charts" sheet
RemoveMissingSheets = Table.SelectRows(ExtractCharts, each [ChartsSheet] <> null),
// 6. Promote headers inside each nested table
PromoteHeaders = Table.TransformColumns(
RemoveMissingSheets,
{"ChartsSheet", each Table.PromoteHeaders(_, [PromoteAllScalars = true])}
),
// 7. Rename file column to keep source traceability
RenameSource = Table.RenameColumns(PromoteHeaders, {{"Name", "SourceName"}}),
// 8. Expand data from the nested tables
ExpandedData = Table.ExpandTableColumn(
RenameSource,
"ChartsSheet",
Table.ColumnNames(PromoteHeaders{0}[ChartsSheet])
),
// 9. Set Standard types
ChangeType = Table.TransformColumnTypes(
ExpandedData,
{
{"SourceName", type text},
{"Period", type date},
{"Total ISFs Filed", Int64.Type},
{"Total On Time ISFs", Int64.Type},
{"Total Late ISFs", Int64.Type},
{"Percentage On Time", Percentage.Type},
{"Percentage Late", Percentage.Type}
}
),
// 10. Remove unneeded columns
RemoveExtraCols = Table.RemoveColumns(ChangeType, {"Column7", "Period_id"}),
// 11. Filter only valid rows
FilterValid = Table.SelectRows(
RemoveExtraCols,
each ([Percentage On Time] <> null and [Percentage Late] <> null)
),
// 12. Duplicate SourceName for split logic
DuplicateSource = Table.DuplicateColumn(FilterValid, "SourceName", "SourceName_Copy1"),
// 13. Split importer info (Transition from Text to Number)
SplitImporter = Table.SplitColumn(
DuplicateSource,
"SourceName_Copy1",
Splitter.SplitTextByCharacterTransition({"c"}, each not List.Contains({"0" .. "9"}, _)),
{"Importer", "ImporterCodePart1", "ImporterCodePart2"}
),
// 14. Clean Importer Name
CleanImporter = Table.ReplaceValue(
SplitImporter,
"ISF REPORT - ",
"",
Replacer.ReplaceText,
{"Importer"}
),
// 15. Merge importer code parts
MergeImporterCode = Table.CombineColumns(
CleanImporter,
{"ImporterCodePart1", "ImporterCodePart2"},
Combiner.CombineTextByDelimiter("", QuoteStyle.None),
"ImporterCode"
),
// 16. Clean file extension from code
CleanImporterCode = Table.ReplaceValue(
MergeImporterCode,
".xlsx",
"",
Replacer.ReplaceText,
{"ImporterCode"}
),
// 17. Reorder final columns
FinalReorder = Table.ReorderColumns(
CleanImporterCode,
{
"SourceName",
"Importer",
"ImporterCode",
"Period",
"Total ISFs Filed",
"Total On Time ISFs",
"Total Late ISFs",
"Percentage On Time",
"Percentage Late"
}
)
in
FinalReorder

This ensures that if a file is missing the specific tab, it is simply ignored instead of causing an enumeration error.
Let me know if it runs smoothly now!
Hi @dragonhood,
If you want to go a completely different route, you can add the SharePoint folder as a shortcut in a Fabric Lakehouse and process the data from there. Here is the link to the docs:
https://learn.microsoft.com/en-us/fabric/onelake/create-onedrive-sharepoint-shortcut
Best regards!
Do you get the error while editing? If yes, at what step does it trigger? Or do you get it when you try to validate or refresh?
Can you confirm that the file paths used in the Source steps for #"ISF reports - New" and #"Sample File (9)" are the same? These should be identical; otherwise, you are mixing data sources, which could cause issues (although usually you'd get errors about the formula firewall rather than credential errors).
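For example, when you open both queries in the Advanced editor, the Source step should be character-for-character the same call (the site URL below is just a placeholder):

// Source step expected in both #"ISF reports - New" and #"Sample File (9)"
Source = SharePoint.Files("https://yourdomain.sharepoint.com/sites/YourSite", [ApiVersion = 15])
// A different site URL, or a different connector function (e.g. SharePoint.Contents),
// makes the engine treat the two queries as separate data sources.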
Did you create this by going through the combine files UI? Or did you perhaps copy/paste from previous combine files queries? A bit tedious, but may be worth creating a new blank test dataflow where you build the query from scratch to confirm it's reproducible and not an anomaly.
Some quick answers to your "is this caused by" questions:
Hi @MarkLaf,
I'm getting the error while refreshing the dataflow, and the file paths used in the Source steps are the same.
Thanks,
Hi @dragonhood,
to my knowledge, Power BI Dataflows treat each distinct URL and each distinct connector function as a separate data source.
Try using exactly the same root reference in every query and function, and make sure your sample file query uses the exact same site URL.
If that still does not work, change the transform function so that it references only the binary passed in to it.
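A minimal sketch of such a function, assuming the "Charts" sheet mentioned earlier in the thread; the important part is that its only input is the binary it receives, so it never opens its own SharePoint connection:

(FileContent as binary) as table =>
let
    // Work only on the binary handed in by the main query
    Workbook = Excel.Workbook(FileContent, null, true),
    ChartsSheet = Table.SelectRows(Workbook, each [Item] = "Charts" and [Kind] = "Sheet"){0}[Data],
    Promoted = Table.PromoteHeaders(ChartsSheet, [PromoteAllScalars = true])
in
    Promoted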
Hope this helps!
Best regards!
Hi @Mauro89, thanks for your reply. These are the queries and functions in my dataflow:
- Parameter9
- Sample File (9)
- Transform File (9)
- Transform Sample File (9)
As I am new to Power BI Dataflows, I didn’t understand this part clearly “Dataflows treat each distinct URL and connector as a separate data source.”
Do you mean I should:
1. Store the SharePoint folder path in a table
2. Call that list in my Dataflow and use the stored path to reference files instead of writing the SharePoint folder URL separately in multiple queries/functions?
Is this the correct approach to make Dataflows treat it as one single data source and avoid refresh/credential issues?
Please confirm in simple terms.
Thanks!
Hi @Mauro89 ,
Should I store the below folder path in a table in Dataflow and reference that table in functions and ISF Reports New instead of writing this path separately in multiple queries?
You can also create a parameter in Power Query and reference the parameter in the Power Query query. No need for a table and you can reuse the parameter every time needed.
Could you please provide a demo or a screenshot on that?
Hi @dragonhood,
check out this page for parameters: Parameters - Power Query | Microsoft Learn
It describes how to create and use them.
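As a quick sketch of how that looks in a query, assuming two text parameters named SharepointSite and FolderPath (the names used earlier in this thread):

let
    // Reference the parameters by name instead of hard-coding the URL and path
    Source = SharePoint.Files(SharepointSite, [ApiVersion = 15]),
    KeepFolder = Table.SelectRows(Source, each [Folder Path] = FolderPath)
in
    KeepFolder

Because every query and function then resolves to the same site URL, the dataflow keeps treating it as a single data source.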
Best regards!