Hello Community,
It's hard to study Fabric in this fast-changing environment! Can someone please explain the changes in AI Foundry? I am trying to follow the workshop I received at FabCon (either Vegas or Vienna!) and the notebooks are failing.
ChatGPT tells me this:
• Microsoft has two co-existing ways to reach GPT models in Azure right now.
• The notebook you're using comes from the Fabric / AI Inference lab series.
Those labs were authored for the new "Azure AI Inference" endpoint, not for classic Azure OpenAI resources.
Are there plans to update this material?
Thanks
Michele
Thanks for your message, Tejaswi; that clears things up a bit.
I will check back later in the hope that the lab materials have been updated.
Best, Michele
You’re very welcome @MicheleParisi ,
I’m really glad that helped. Checking back later is definitely the right call: the team is in the process of updating the lab materials to match the latest Azure AI Foundry changes. Once the new versions are live, those notebooks should run smoothly again. Thanks for hanging in there while everything gets updated.
Thank you,
Tejaswi
Hi @MicheleParisi ,
I wanted to follow up and see if you had a chance to review the information shared. If you have any further questions or need additional assistance, feel free to reach out.
Thank you.
Hi @MicheleParisi ,
You’re totally right: things are changing quickly in this space, and it’s tough to keep up while you’re studying. The updates in AI Foundry are happening because Microsoft is moving from the classic Azure OpenAI setup to the newer Azure AI Foundry (also referred to as Azure AI Inference). Both systems are available for now, but they use different endpoints and SDKs.
The old Azure OpenAI endpoints look like https://<name>.openai.azure.com/, while the new AI Foundry/Inference ones use https://<region>.models.ai.azure.com/. The workshop notebooks from FabCon were made for AI Foundry, so if your setup is still on the old Azure OpenAI resource, the notebook calls won’t work until you switch to the new Azure AI Inference configuration.
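If you are unsure which kind of resource a notebook is pointed at, the hostname alone tells you. Here is a small sketch that classifies an endpoint URL by the two hostname patterns mentioned above (the helper name `endpoint_style` is just an illustration, not part of any Azure SDK):

```python
# Heuristic: classify an Azure endpoint URL by its hostname suffix.
# Patterns are the two styles discussed in this thread; anything else
# is reported as unknown rather than guessed at.
from urllib.parse import urlparse

def endpoint_style(url: str) -> str:
    host = urlparse(url).hostname or ""
    if host.endswith(".openai.azure.com"):
        return "classic Azure OpenAI"
    if host.endswith(".models.ai.azure.com"):
        return "Azure AI Foundry / AI Inference"
    return "unknown"

print(endpoint_style("https://myresource.openai.azure.com/"))   # classic Azure OpenAI
print(endpoint_style("https://eastus2.models.ai.azure.com/"))   # Azure AI Foundry / AI Inference
```

Running this against the endpoint shown in your resource overview tells you which SDK the notebook needs.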
Here’s what you can do: check your endpoint type in the Azure portal and make sure you’ve got the right client library. If you’re using AI Foundry, install or update the azure-ai-inference package and update your notebook to use the Foundry endpoint listed in your resource overview.
You’ll need to set the Foundry project endpoint and model deployment name in the notebook; that usually sorts things out. If you’re still on the classic Azure OpenAI resource, you can either create a new AI Foundry workspace or tweak the notebook to use the older Azure OpenAI SDK.
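As a rough sketch of that setup step, assuming you keep the endpoint, key, and deployment name in environment variables (the variable names here are my own placeholders, not values from the workshop):

```python
# Sketch: collect the AI Foundry connection settings a notebook needs.
# The env var names and the <...> defaults are placeholders; fill them
# in from your resource overview in the Azure portal.
import os

config = {
    "endpoint": os.getenv("AZURE_INFERENCE_ENDPOINT", "https://<region>.models.ai.azure.com/"),
    "key": os.getenv("AZURE_INFERENCE_KEY", "<your-key>"),
    "model": os.getenv("MODEL_DEPLOYMENT", "<deployment-name>"),
}

# With azure-ai-inference installed, the client would then be built
# roughly like this (commented out, since it needs live credentials):
# from azure.ai.inference import ChatCompletionsClient
# from azure.core.credentials import AzureKeyCredential
# client = ChatCompletionsClient(
#     endpoint=config["endpoint"],
#     credential=AzureKeyCredential(config["key"]),
# )

print(config["endpoint"])
```

Once those three values match what the portal shows for your Foundry resource, the lab notebooks should stop failing at the connection step.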
The good news is Microsoft is updating the workshop materials to match these changes. You can check the official Azure AI Foundry documentation and the official GitHub workshop repository for the latest updates or any open issues about endpoints.
Hope this clears things up and helps you get your notebooks working again.
Best Regards,
Tejaswi.
Community Support