Some of you might want to use the Azure OpenAI proxy outside of a notebook, maybe in some custom code or in Power Automate.
It's easy if you know the URL...
1) Use the correct URL to POST a chat completion request: https://polite-ground-030dc3103.4.azurestaticapps.net/api/v1/openai/deployments/gpt-35-turbo-16k/chat/completions
Replace the model name with whichever deployment you want to play with. Careful: the /chat/ part of the URL is essential.
2) Add a header to the request: api-key
3) Add your API key as the header value
4) Construct the body using the messages syntax. A good example can be found here: https://microsoft.github.io/azure-openai-service-proxy/raw-api-access/chat-completion/
Careful! That last link shows an outdated URL structure. Make sure to use:
https://polite-ground-030dc3103.4.azurestaticapps.net/api/v1/openai/deployments/{model-name}/chat/completions
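Putting the four steps together, here is a minimal Python sketch using the requests library. The deployment name, the API key placeholder, and the prompt are just example values, and it assumes the proxy returns the standard Azure OpenAI chat completion response shape, so swap in your own details.

```python
import requests

# Step 1: the proxy endpoint; replace gpt-35-turbo-16k with the deployment you want to use
url = (
    "https://polite-ground-030dc3103.4.azurestaticapps.net"
    "/api/v1/openai/deployments/gpt-35-turbo-16k/chat/completions"
)

# Steps 2 and 3: the api-key header carries your key (placeholder value here)
headers = {
    "api-key": "YOUR-API-KEY",
    "Content-Type": "application/json",
}

# Step 4: the body uses the chat "messages" syntax; the prompt is only an example
body = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello to the Fabric community."},
    ],
    "max_tokens": 100,
    "temperature": 0.7,
}

response = requests.post(url, headers=headers, json=body)
response.raise_for_status()

# Assuming the usual OpenAI-style response, the reply sits in choices[0].message.content
print(response.json()["choices"][0]["message"]["content"])
```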
And voilà, you can use the OpenAI API in other places like Power Automate too 🙂
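In Power Automate, the same call should map onto the HTTP action: set the method to POST, the URI to the /chat/completions URL above, add the api-key header, and paste a JSON body like the one in the sketch.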