I've registered my model as a Model Endpoint successfully, but I cannot call the model through that endpoint. I do not know where to get the Fabric token. Where should I find it? I tried to create an app registration and use that token, but it did not work.
Hi @bao_phan ,
Thank you for reaching out to Microsoft Community.
To call your model through a Fabric Model Endpoint, you need a Microsoft Entra ID (formerly Azure AD) access token with the appropriate permissions. Here's how to set it up properly:
Register an application in Microsoft Entra ID.
Add API permissions for Microsoft Fabric by including the scope:
https://api.fabric.microsoft.com/.default
You use this scope when requesting a token from Microsoft Entra ID (formerly Azure AD). It is a permission scope, used only at token-acquisition time: it tells Entra ID to issue a token that includes all application permissions the app has been granted for Microsoft Fabric.
Generate the access token using the Microsoft Authentication Library (MSAL) or a tool like Postman:
For delegated access (user context), use AcquireTokenInteractive.
For application-level access (service principal), use AcquireTokenForClient along with the client secret.
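For the application-level (service principal) path, the steps above can be sketched with the plain OAuth2 client-credentials flow, which is what MSAL's AcquireTokenForClient wraps. This is a stdlib-only sketch: the tenant ID, client ID, and client secret are placeholders for your own app registration.

```python
# Sketch: acquire an app-only Entra ID token for the Fabric API using the
# OAuth2 client-credentials flow (what MSAL's AcquireTokenForClient does).
# tenant_id / client_id / client_secret are placeholders for your app registration.
import json
import urllib.parse
import urllib.request

FABRIC_SCOPE = "https://api.fabric.microsoft.com/.default"

def build_token_request(tenant_id, client_id, client_secret):
    """Build the POST request for the Entra ID v2.0 token endpoint."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": FABRIC_SCOPE,
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")

def get_token(tenant_id, client_id, client_secret):
    """Send the request and return the access token from the JSON response."""
    req = build_token_request(tenant_id, client_id, client_secret)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

if __name__ == "__main__":
    # Placeholders: substitute your own tenant, app (client) ID, and secret.
    token = get_token("<tenant-id>", "<app-client-id>", "<client-secret>")
```

In production you would normally use the MSAL library instead of hand-rolling this request, since MSAL also handles token caching and refresh.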
Please refer to these documents and blogs for troubleshooting your issue: Serve real-time predictions with ML model endpoints (Preview) - Microsoft Fabric | Microsoft Learn
Using FabricRestClient To Make Fabric REST API Calls
c# - Generating token for Fabric Rest api using client secret - Stack Overflow
Hope this helps.
Best regards,
Chaithra E.
Hi @bao_phan
If you want to call your Fabric model endpoint, you’ll need to get a Microsoft Entra token with the right permissions. The easiest way is to register an app in Entra ID, add the Fabric API permission (https://api.fabric.microsoft.com/.default), and then use MSAL to get the token. Once you have it, just pass it in your request as a Bearer token.
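Passing the token as a Bearer header can be sketched like this; the endpoint URL and payload shape are placeholders, so substitute the URL shown on your model's endpoint page and the input schema your model actually expects.

```python
# Sketch: call a Fabric model endpoint with the Entra token as a Bearer header.
# The endpoint URL and payload shape below are placeholders, not real values.
import json
import urllib.request

def build_scoring_request(url, token, payload):
    """Build a POST request carrying the token in an Authorization header."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def predict(url, token, payload):
    """Send the scoring request and return the parsed JSON response."""
    with urllib.request.urlopen(build_scoring_request(url, token, payload)) as resp:
        return json.load(resp)
```

The same pattern works with the `requests` library (`requests.post(url, json=payload, headers={"Authorization": f"Bearer {token}"})`) if it is available in your environment.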
Regards,
Mehrdad Abdollahi
Hi @mabdollahi ,
Thanks for your response. You're right: I got the token directly through mssparkutils.credentials.getToken() and it works. But now I want to call the model inside KQL, and KQL does not support mssparkutils. Let me try your option to see whether I can make it work in KQL as well (I only wrote the test code in a notebook, but it must also be able to run in KQL).
Hi @bao_phan
Hope all is well on your end.
Did you get it working in KQL? I'd like to hear about your experience.
Regards,
Mehrdad
Hi @mabdollahi , sorry for the late response! I was able to call the model in KQL. There were two options to achieve this. First, we can dump the model as binary into the KQL database and then call it via the KQL database. Otherwise, we have to host it on an ML endpoint in Fabric and call it via requests. (The first method is only suitable for small models; mine was pretty large, so it took time to dump the model into the KQL DB and read it back, and I wanted to minimize that time so that predictions could run at low latency.)
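The first option (storing the model as binary in a KQL table) can be sketched as a serialize/deserialize round trip. This is a minimal sketch with a stand-in dict as the "model"; a real ML model (e.g. scikit-learn) pickles the same way but can be much larger, which is the latency cost mentioned above.

```python
# Sketch: serialize a model to a base64 string so it can be stored in a
# KQL database string column, then deserialize it when scoring.
# `model` below is a placeholder dict standing in for a real ML model.
import base64
import pickle

def model_to_b64(model) -> str:
    """Pickle the model and base64-encode it for storage as a string."""
    return base64.b64encode(pickle.dumps(model)).decode("ascii")

def model_from_b64(blob: str):
    """Reverse: decode and unpickle a model fetched from the table."""
    return pickle.loads(base64.b64decode(blob))

# Round-trip a placeholder "model"
model = {"weights": [0.2, 0.8], "bias": 0.1}
blob = model_to_b64(model)
restored = model_from_b64(blob)
```

Note how the blob size grows with the model, which matches the observation that this approach only suits small models when low-latency predictions are needed.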