Hi Guys,
I have uploaded my desktop file to the Power BI service, but it won't allow a scheduled refresh and gives the following error.
You can't schedule refresh for this semantic model because the following data sources currently don't support refresh:
Here is the function Query1. I have over 30 other queries that call it, and that list will grow.
(tenant_name, site_name, list_name) =>
let
    tenantname = tenant_name,
    sitename = site_name, // if a subsite use "Site/SubSite"
    listname = list_name,
    baseurl = "https://" & tenantname & "/sites/" & sitename & "/_api/web/lists/GetByTitle('" & listname & "')/",
    itemcount = Json.Document(Web.Contents(baseurl & "ItemCount", [Headers=[Accept="application/json"]]))[value],
    skiplist = List.Numbers(0, Number.RoundUp(itemcount/5000), 5000),
    #"Converted to Table" = Table.FromList(skiplist, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Skip"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Skip", type text}}),
    fieldselect = "&$top=5000", // all fields with no expansion
    //fieldselect = "&$top=5000&$select = Id,Title,Person,Date", // list desired fields (no expansion)
    //fieldselect = "&$top=5000&$select = Id,Title,Person/LastName,Person/FirstName,Date&$expand = Person", // expand list fields
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Items", each Json.Document(Web.Contents(baseurl & "/items?$skipToken=Paged=TRUE%26p_ID=" & [Skip] & fieldselect, [Headers=[Accept="application/json"]]))),
    #"Expanded Items" = Table.ExpandRecordColumn(#"Added Custom", "Items", {"value"}, {"value"}),
    #"Expanded value" = Table.ExpandListColumn(#"Expanded Items", "value")
in
    #"Expanded value"
How can this be amended so it works in the service with scheduled refresh?
Thanks
Hi @jbrines ,
Thanks for engaging with the Fabric Community. Apologies for the delayed response.
I just wanted to follow up and check if the issue has been resolved. Thanks to @Greg_Deckler for the helpful suggestions.
If the issue is still unresolved, here are some alternative approaches that might work for your scenario:
1. Instead of a function query, you can try using List.Generate() to iterate through the pages within a single query. This lets you fetch paginated data without needing dynamic URL generation.
List.Generate - PowerQuery M | Microsoft Learn
2. If the SharePoint API limits or scheduled refresh restrictions persist, consider preloading the data into a Power BI Dataflow first. This allows Power BI to handle the data transformation before importing it into your report, ensuring stable scheduled refreshes.
3. The 404 error you encountered suggests there might be an issue with the API URL. Double-check that the constructed URL matches the SharePoint API documentation, and test it outside Power BI to confirm it works correctly.
For further details on managing large SharePoint lists and libraries, see Manage large lists and libraries - Microsoft Support
Please let us know if any of these approaches help or if you need further assistance.
If this post helps, then please consider accepting it as the solution to help other members find it more quickly; a kudos would be appreciated.
Best regards,
Vinay.
Hi @v-veshwara-msft ,
Thanks for getting back to me.
I am fairly new to Power BI as you can see.
Some of the lists will be over 10-15k rows.
Could you give me an example of how List.Generate would help me?
Thanks
John.
Hi @jbrines,
Thanks for your patience.
List.Generate() automates fetching multiple pages of data until all records are retrieved.
Here is an example script that fetches data from your SharePoint list:
let
    TenantName = "yourtenant.sharepoint.com",
    SiteName = "YourSite",
    ListName = "YourList",
    PageSize = 5000,
    BaseUrl = "https://" & TenantName & "/sites/" & SiteName & "/_api/web/lists/GetByTitle('" & ListName & "')/ItemCount",
    ItemCount = Json.Document(Web.Contents(BaseUrl, [Headers=[Accept="application/json"]]))[value],
    // Function to generate pages
    GetData = List.Generate(
        () => [Skip = 0, Data = {}],
        each [Skip] < ItemCount,
        each [
            Skip = [Skip] + PageSize,
            Data = Json.Document(Web.Contents("https://" & TenantName & "/sites/" & SiteName & "/_api/web/lists/GetByTitle('" & ListName & "')/items?$top=" & Number.ToText(PageSize) & "&$skiptoken=Paged=TRUE%26p_ID=" & Number.ToText([Skip]), [Headers=[Accept="application/json"]]))[value]
        ],
        each [Data]
    ),
    // Convert list to table and expand records
    TableData = Table.FromList(GetData, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    ExpandedTable = Table.ExpandListColumn(TableData, "Column1")
in
    ExpandedTable
Please try the above script in the Advanced Editor and let us know if you face any issues.
If you want a more detailed breakdown of List.Generate(), check out this Microsoft guide: List.Generate - PowerQuery M | Microsoft Learn
If this helps, please consider accepting it as a solution so others can find it easily.
Thanks,
Vinay.
Hi @v-veshwara-msft ,
Tried it on a site that has 5295 items and it is only showing 5001. What needs to be done to get the rest?
Thanks
Hi @jbrines,
Apologies for the delayed response, and thank you for your patience.
Hope your issue is resolved. If not, please consider the following.
The behavior where only 5001 out of 5295 items are returned suggests that the $skiptoken logic may not be advancing correctly. This often occurs when the p_ID used in the token doesn’t reference the actual last item ID from the previous page.
To ensure accurate pagination, the script needs to dynamically capture the last item’s ID from each page and use that as the starting point for the next. The earlier logic used a precomputed skip value, which isn’t compatible with SharePoint’s paging mechanism.
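As a rough, untested sketch of that idea (placeholder tenant, site, and list names; it assumes each returned item record includes its Id field), ID-based paging with List.Generate() could look like:

```
let
    BaseUrl = "https://yourtenant.sharepoint.com",   // placeholder tenant
    ListPath = "sites/YourSite/_api/web/lists/GetByTitle('YourList')/items",
    PageSize = 5000,
    // Fetch one page, starting after the given item ID
    GetPage = (lastId as number) =>
        Json.Document(
            Web.Contents(
                BaseUrl,
                [
                    RelativePath = ListPath,
                    Query = [
                        #"$top" = Number.ToText(PageSize),
                        // the skiptoken advances from the last item ID seen, not a row offset
                        #"$skiptoken" = "Paged=TRUE&p_ID=" & Number.ToText(lastId)
                    ],
                    Headers = [Accept = "application/json"]
                ]
            )
        )[value],
    Pages = List.Generate(
        () => [LastId = 0, Page = GetPage(0)],
        each List.Count([Page]) > 0,   // stop when a page comes back empty
        each [
            // read the Id of the final record on the previous page
            LastId = List.Last(_[Page])[Id],
            Page = GetPage(LastId)
        ],
        each [Page]
    ),
    Combined = List.Combine(Pages)   // a list of item records, ready to convert to a table
in
    Combined
```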
I'll validate a revised version of the script using List.Generate() with ID-based pagination and share a working solution after testing with a large dataset.
Thanks again for following up.
Best regards,
Vinay.
Hi @v-veshwara-msft I did find the following, which appears to be working; not sure if it is the right way.
let
    Source = SharePoint.Tables("https://clarkcontracts.sharepoint.com/sites/CE4080-PitfodelsWoodCareHomeAberdeen", [Implementation="2.0"]),
    List = Source{[Title="Site Sign In"]}[Items],
    PageSize = 5000,
    GetPage = (PageNumber as number) =>
        let
            Skip = PageNumber * PageSize,
            Page = OData.Feed("https://clarkcontracts.sharepoint.com/sites/CE4080-PitfodelsWoodCareHomeAberdeen/_api/web/lists/getbytitle('Site Sign In')/items?$top=" & Number.ToText(PageSize) & "&$skip=" & Number.ToText(Skip))
        in
            Page,
    Pages = List.Generate(() => 0, each _ < 10, each _ + 1, each GetPage(_)),
    CombinedPages = Table.Combine(Pages)
in
    CombinedPages
Hi @jbrines ,
Thanks for following up and sharing your approach.
The solution you've implemented is valid and works well when the number of pages is known or capped (as in your loop up to 10). However, just a note that SharePoint's OData API may not always support $skip reliably beyond certain thresholds (especially with large lists), and behavior can vary across environments.
For more robust pagination, especially when item counts are dynamic or exceed 5000 rows, using $skiptoken based on item ID is generally recommended, as it aligns better with how SharePoint handles large data sets internally.
That said, if your current solution consistently returns all rows without issues in your scenario, it's absolutely fine to proceed with it.
If any of the responses helped, please consider accepting it as the solution to assist others facing similar scenarios.
Thank you.
Hi @v-veshwara-msft it turns out the code wasn't working, as there are over 5000 records, and I get the following error. I cannot see where $skiptoken is being used.
DataSource.Error: OData: Request failed: The remote server returned an error: (400) Bad Request. (The $skip and $skiptoken cannot be specified at the same time.)
Details:
DataSourceKind=OData
DataSourcePath=https://clarkcontracts.sharepoint.com/sites/CE4080-PitfodelsWoodCareHomeAberdeen/_api/web/lists/getbytitle(%27Site%20Sign%20In%27)/items
sprequestguid=7242a0a1-d00b-c000-7e49-b52fa3e03e65
Url=https://clarkcontracts.sharepoint.com/sites/CE4080-PitfodelsWoodCareHomeAberdeen/_api/web/lists/getbytitle('Site Sign In')/items?%24skiptoken=Paged%3dTRUE%26p_ID%3d5017&%24top=5000&%24skip=0
If I change $skip to $skiptoken in the code, I get a lot of duplicated records.
John.
Hi @jbrines ,
Thanks for the follow-up and for highlighting the issue with $skiptoken.
You're right: the original script used $skip, and $skiptoken wasn't present initially. When both appear together in the request URL (even unintentionally), SharePoint returns an error, since it doesn't support using both parameters at the same time.
That said, it's great to hear you found a working solution using SharePoint.Tables as discussed here. This is the recommended approach when working with large SharePoint lists, as it handles paging internally and avoids the 5000-item threshold issues altogether.
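For anyone landing here later, a minimal sketch of that SharePoint.Tables approach (site URL and list title taken from this thread; adjust for your own list):

```
let
    // Implementation = "2.0" makes the connector page large lists internally,
    // avoiding the 5000-item view threshold
    Source = SharePoint.Tables(
        "https://clarkcontracts.sharepoint.com/sites/CE4080-PitfodelsWoodCareHomeAberdeen",
        [Implementation = "2.0"]
    ),
    // select the list by its display title
    Items = Source{[Title = "Site Sign In"]}[Items]
in
    Items
```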
Thanks again for your engagement on this topic.
Please consider accepting it as a solution so others with similar scenarios can benefit as well.
Regards,
Vinay,
Community Support Team.
@jbrines I would move the &"ItemCount" into your baseurl and see if that fixes it.
Chris Webb's BI Blog: Web.Contents(), M Functions And Dataset Refresh Errors In Power BI
Hi @Greg_Deckler , so basically you would remove the comma and use & to join it to the base URL?
I should have said up front that this isn't my code; I found it online, as I have multiple lists and some have more than 5000 rows.
@jbrines I was thinking to change these two lines:
baseurl = "https://" & tenantname & "/sites/" & sitename & "/_api/web/lists/GetByTitle('" & listname & "')/ItemCount",
itemcount = Json.Document(Web.Contents(baseurl, [Headers=[Accept="application/json"]]))[value],
Hi @Greg_Deckler ,
You are getting there. Once I make the changes, the other queries come up with the following error.
DataSource.Error: Web.Contents failed to get contents from 'https://clarkcontracts.sharepoint.com/sites/CE4077-SilverburnParkFlaxMillLeven/_api/web/lists/GetByTitle('Site%20Sign%20In')/ItemCount/items?$skipToken=Paged=TRUE%26p_ID=0&$top=5000' (404): Not Found
Details:
DataSourceKind=Web
DataSourcePath=https://clarkcontracts.sharepoint.com/sites/CE4077-SilverburnParkFlaxMillLeven/_api/web/lists/GetByTitle('Site%20Sign%20In')/ItemCount/items
Url=https://clarkcontracts.sharepoint.com/sites/CE4077-SilverburnParkFlaxMillLeven/_api/web/lists/GetByTitle('Site%20Sign%20In')/ItemCount/items?$skipToken=Paged=TRUE%26p_ID=0&$top=5000
@jbrines OK, that is a 404 not found error so there is something wrong with the URL. So, you'll have to figure out what the correct URL is for what you want and then make sure that you construct it completely and accurately before turning Web.Contents loose on it. As long as you don't have any "dynamic" components to the URL you use in Web.Contents (use the query options if you need to feed in query string parameters) then you shouldn't get the error where it does not allow you to refresh the source.
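For reference, the pattern from Chris Webb's post keeps the first argument to Web.Contents static and moves the variable parts into the RelativePath and Query options. A minimal sketch, using the site and list names from this thread as placeholders:

```
let
    // The first argument is a fixed URL, so the service can validate the
    // data source for scheduled refresh; the variable parts move into options
    Response = Web.Contents(
        "https://clarkcontracts.sharepoint.com",
        [
            RelativePath = "sites/CE4077-SilverburnParkFlaxMillLeven/_api/web/lists/GetByTitle('Site Sign In')/items",
            Query = [#"$top" = "5000"],
            Headers = [Accept = "application/json"]
        ]
    ),
    Items = Json.Document(Response)[value]
in
    Items
```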
This is the correct URL
https://clarkcontracts.sharepoint.com/sites/CE4077-SilverburnParkFlaxMillLeven/Lists/Site%20Sign%20In/AllItems.aspx
The issue is that some of the lists have more than 5000 items (the list view threshold), and that is why I was using that function query.
I suppose if there was a way to change the query so it didn't need the function query, then I could use the query below.
let
    Source = Query1("clarkcontracts.sharepoint.com", "CE4077-SilverburnParkFlaxMillLeven", "Site Sign In"),
    #"Expanded value" = Table.ExpandRecordColumn(Source, "value", {"Title", "Modified", "Created", "Company", "SignInDateandTime", "SignOutDateandTime", "Profession", "AreyouaDriver_x002f_Passenger", "PostcodeStart", "ProjectPostcode", "Distance_x0028_includingreturnjo", "SignInLocation", "SignOutLocation", "Contract", "ModeofTransport", "Created_x0020_Date0", "Created_x0020_Time", "Modified_x0020_Time", "Sign_x0020_Out_x0020_Time", "Hours", "AuthorId", "EditorId"}, {"value.Title", "value.Modified", "value.Created", "value.Company", "value.SignInDateandTime", "value.SignOutDateandTime", "value.Profession", "value.AreyouaDriver_x002f_Passenger", "value.PostcodeStart", "value.ProjectPostcode", "value.Distance_x0028_includingreturnjo", "value.SignInLocation", "value.SignOutLocation", "value.Contract", "value.ModeofTransport", "value.Created_x0020_Date0", "value.Created_x0020_Time", "value.Modified_x0020_Time", "value.Sign_x0020_Out_x0020_Time", "value.Hours", "value.AuthorId", "value.EditorId"}),
    #"Trimmed Text" = Table.TransformColumns(Table.TransformColumnTypes(#"Expanded value", {{"value.Company", type text}}, "en-GB"), {{"value.Title", Text.Proper, type text}}),
    #"Replaced Value" = Table.ReplaceValue(#"Trimmed Text", null, "CE4077", Replacer.ReplaceValue, {"value.Contract"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Replaced Value", {{"value.Modified", type datetime}, {"value.Created", type datetime}, {"value.SignInDateandTime", type datetime}, {"value.SignOutDateandTime", type datetime}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type", {{"value.Title", "Name"}, {"value.Modified", "Modified"}, {"value.Created", "Created"}, {"value.Company", "Company"}, {"value.SignInDateandTime", "Sign In Date and Time"}, {"value.SignOutDateandTime", "Sign Out Date and Time"}, {"value.Profession", "Profession"}, {"value.SignInLocation", "Sign In Location"}, {"value.SignOutLocation", "Sign Out Location"}, {"value.ModeofTransport", "Mode of Transport"}, {"value.AreyouaDriver_x002f_Passenger", "Are you a Driver or Passenger"}, {"value.PostcodeStart", "Postcode Start"}, {"value.ProjectPostcode", "Project Postcode"}, {"value.Distance_x0028_includingreturnjo", "Distance includingr eturn journey"}, {"value.Contract", "Contract"}}),
    #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns", {"Skip"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Removed Columns", {{"Distance includingr eturn journey", "Distance including return journey"}, {"value.Created_x0020_Date0", "Created Date"}, {"value.Created_x0020_Time", "Created Time"}, {"value.Modified_x0020_Time", "Modified Time"}, {"value.Sign_x0020_Out_x0020_Time", "Sign Out Time"}, {"value.Hours", "Hours"}}),
    #"Replaced Value1" = Table.ReplaceValue(#"Renamed Columns1", "Clark contracts", "Clark Contracts", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value2" = Table.ReplaceValue(#"Replaced Value1", "CCL-Search", "Search", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value3" = Table.ReplaceValue(#"Replaced Value2", "Ccl", "Clark Contracts", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value4" = Table.ReplaceValue(#"Replaced Value3", "Clarks ", "Clark Contracts", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value5" = Table.ReplaceValue(#"Replaced Value4", "Clark contracts ", "Clark Contracts", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value6" = Table.ReplaceValue(#"Replaced Value5", "Clark Contracts Ltd ", "Clark Contracts", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value7" = Table.ReplaceValue(#"Replaced Value6", "Clark Contracts Ltd", "Clark Contracts", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value8" = Table.ReplaceValue(#"Replaced Value7", "Clark Contracts ", "Clark Contracts", Replacer.ReplaceValue, {"Company"}),
    #"Replaced Value9" = Table.ReplaceValue(#"Replaced Value8", "CCL", "Clark Contracts", Replacer.ReplaceValue, {"Company"})
in
    #"Replaced Value9"
Any ideas on how to change the code above so it brings in all the items from the list but doesn't need the function query?
Hi @Greg_Deckler ,
Changed it to this and still getting the error. Apologies if this isn't what you were suggesting.
(tenant_name, site_name, list_name) =>
let
    tenantname = tenant_name,
    sitename = site_name, // if a subsite use "Site/SubSite"
    listname = list_name,
    baseurl = "https://" & tenantname & "/sites/" & sitename & "/_api/web/lists/GetByTitle('" & listname & "')/ItemCount",
    itemcount = Json.Document(Web.Contents(baseurl & "ItemCount", [Headers=[Accept="application/json"]]))[value],
    skiplist = List.Numbers(0, Number.RoundUp(itemcount/5000), 5000),
    #"Converted to Table" = Table.FromList(skiplist, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Skip"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Skip", type text}}),
    fieldselect = "&$top=5000", // all fields with no expansion
    //fieldselect = "&$top=5000&$select = Id,Title,Person,Date", // list desired fields (no expansion)
    //fieldselect = "&$top=5000&$select = Id,Title,Person/LastName,Person/FirstName,Date&$expand = Person", // expand list fields
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Items", each Json.Document(Web.Contents(baseurl & "/items?$skipToken=Paged=TRUE%26p_ID=" & [Skip] & fieldselect, [Headers=[Accept="application/json"]]))),
    #"Expanded Items" = Table.ExpandRecordColumn(#"Added Custom", "Items", {"value"}, {"value"}),
    #"Expanded value" = Table.ExpandListColumn(#"Expanded Items", "value")
in
    #"Expanded value"
@jbrines Looks like the first part you got right but this line needs to be this instead:
itemcount = Json.Document(Web.Contents(baseurl, [Headers=[Accept="application/json"]]))[value],
In other words, you don't want to add anything to the url that you are feeding to Web.Contents. The url (baseurl in your case) should be fully formed prior to feeding it to Web.Contents.
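To make that concrete, here is a fragment (not a full query; tenantname, sitename, and listname come from the function parameters as before) with baseurl left pointing at the list itself, so every URL handed to Web.Contents is fully formed:

```
    baseurl = "https://" & tenantname & "/sites/" & sitename & "/_api/web/lists/GetByTitle('" & listname & "')/",
    // count endpoint: the URL is complete before Web.Contents sees it
    itemcount = Json.Document(Web.Contents(baseurl & "ItemCount", [Headers = [Accept = "application/json"]]))[value],
    // items endpoint: built from the same baseurl, not from the ItemCount URL
    items = Json.Document(Web.Contents(baseurl & "items?$top=5000", [Headers = [Accept = "application/json"]]))[value]
```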