Some of you might want to use the Azure OpenAI proxy outside of a notebook, maybe in some custom code or in Power Automate.
It's easy, if you know the URL...
1) Use the correct URL to POST a chat completion request: https://polite-ground-030dc3103.4.azurestaticapps.net/api/v1/openai/deployments/gpt-35-turbo-16k/cha...
Replace the model name with whichever one you want to play with. Careful: the /chat/ part of the URL is essential.
2) Add a header to the request: api-key
3) Add your API key as the header value
4) Construct the body using the messages syntax. A good example can be found here: https://microsoft.github.io/azure-openai-service-proxy/raw-api-access/chat-completion/ (a full request is sketched after this list).
Careful! That last link has an outdated URL structure. Make sure to use
https://polite-ground-030dc3103.4.azurestaticapps.net/api/v1/openai/deployments/{model-name}/chat/completions
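Putting the four steps together, here is a minimal sketch in Python using the requests library. The deployment name gpt-35-turbo-16k, the example messages, and the AOAI_PROXY_KEY environment variable are just placeholders I'm using for illustration; swap in your own deployment, prompt, and key.

```python
import os
import requests

# Step 1: the proxy endpoint. Replace the deployment name with the model
# you want to use; the /chat/completions part of the path is required.
URL = (
    "https://polite-ground-030dc3103.4.azurestaticapps.net"
    "/api/v1/openai/deployments/gpt-35-turbo-16k/chat/completions"
)

# Steps 2 and 3: the api-key header carries your API key
# (read here from an environment variable chosen for this example).
headers = {
    "api-key": os.environ["AOAI_PROXY_KEY"],
    "Content-Type": "application/json",
}

# Step 4: the body uses the standard chat-completion "messages" syntax.
body = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in one sentence."},
    ],
    "max_tokens": 100,
}

response = requests.post(URL, headers=headers, json=body)
response.raise_for_status()

# The assistant's reply sits in choices[0].message.content.
print(response.json()["choices"][0]["message"]["content"])
```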
And voilà, you can use the OpenAI API in other places like Power Automate 🙂
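In Power Automate the same steps should map onto the HTTP action (a premium connector): set Method to POST, put the URL above in the URI field, add the api-key header, and paste the JSON messages body into the Body field.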