I am using an Entra ID token to call the Business Central API from a Fabric notebook.
The token's expiry time is 1 hour. I am aggregating the records as they are imported, since the notebook pulls only 20k records per page from BC. When I tried to import all the data using this aggregation loop:
import requests
import pandas as pd

# Your API endpoint and token (assume the token variable already exists)
api_endpoint = "https://api.businesscentral.dynamics.com/..."
headers = {
    "Authorization": f"Bearer {token['access_token']}",
    "Content-Type": "application/json",
}

# Aggregate all records from the paginated responses
all_records = []
current_url = api_endpoint
while current_url:
    response = requests.get(current_url, headers=headers)
    if response.status_code != 200:
        print("Error fetching data:", response.text)
        break
    data = response.json()
    # Append the fetched records
    all_records.extend(data.get("value", []))
    # Find the URL for the next page, if there is one
    current_url = data.get("@odata.nextLink")

# Create a Pandas DataFrame with all the aggregated records
df = pd.DataFrame(all_records)
The loop runs for longer than the token's expiry time, after which I get this error:
Error fetching data: <error xmlns="http://docs.oasis-open.org/odata/ns/metadata"><code>Unauthorized</code><message>The credentials provided are incorrect</message></error>
and I only get around half the data (my table is 17M rows; I got 8.5M).
What can I do in this situation?
PS: The reason I am using a notebook is that when we tried Dataflow Gen2, we always got an error when fetching the main fact table. We tried turning on Fast Copy, but it doesn't work with Business Central (here is a vlog that shows the error we got with the dataflow: link). We also tried connecting the BC API to a data pipeline instead of a dataflow, but data pipelines don't accept the Business Central API as a source. So the only option we were able to find was a notebook. If there are any other, simpler options, please share them.
Solved! Go to Solution.
Using a single access token for a long-running pagination process can lead to token expiration before all records are fetched, which is why you see the “Unauthorized” error and only half the expected data.
Instead of relying on a single token throughout the entire process, add logic to check the token's expiry and refresh it before it lapses. This is commonly done with a refresh token, or by using a library like MSAL, which handles token renewal automatically.
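Here is a minimal sketch of that pattern. The `acquire_token` callable is an assumption you supply yourself (for example, a wrapper around `msal.ConfidentialClientApplication.acquire_token_for_client`, which caches tokens and only calls Entra ID again when the cached token is near expiry); it should return the access token and its expiry as epoch seconds. The loop checks the expiry before each page and refreshes a few minutes early so a request never goes out with a stale token:

```python
import time
import requests

TOKEN_REFRESH_SKEW = 300  # refresh 5 minutes before the actual expiry


def is_expired(expires_on, now=None, skew=TOKEN_REFRESH_SKEW):
    """True when the token expires within `skew` seconds of `now`."""
    now = time.time() if now is None else now
    return now >= expires_on - skew


def fetch_all_pages(first_url, acquire_token, session=None):
    """Walk @odata.nextLink pages, refreshing the token when near expiry.

    `acquire_token` is a caller-supplied function returning
    (access_token, expires_on_epoch_seconds) -- e.g. built on MSAL.
    """
    session = session or requests.Session()
    token, expires_on = acquire_token()
    records, url = [], first_url
    while url:
        if is_expired(expires_on):
            # Refresh before the token lapses, mid-pagination
            token, expires_on = acquire_token()
        resp = session.get(url, headers={"Authorization": f"Bearer {token}"})
        resp.raise_for_status()
        data = resp.json()
        records.extend(data.get("value", []))
        url = data.get("@odata.nextLink")
    return records
```

With this structure, even a multi-hour pull of 17M rows keeps working: every page request is guaranteed a token with at least five minutes of validity left, and the refresh happens transparently inside the loop.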
Hi @Hussein_charif,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @Hussein_charif,
May I ask if you have resolved this issue? If so, please mark it as the solution. This will help other community members with similar problems solve them faster.
Thank you.
Hi @Hussein_charif ,
Thank you for posting your query in the Microsoft Fabric Community Forum, and thanks to @nilendraFabric for sharing valuable insights.
Could you please confirm if your query has been resolved by the provided solution? If so, please mark it as the solution. This will help other community members solve similar problems faster.
Thank you.