Azure OpenAI

The Azure OpenAI connection allows the Intelligence Hub to interact with Azure-hosted OpenAI models for generative AI and LLM workflows. The connection sends prompts to, and receives responses from, models deployed in an Azure OpenAI resource. Below are details on the connection and input settings.

Use care when integrating this connection into production workflows. Due to the rapid pace of AI and LLM development, providers have made breaking changes to their APIs in the past. While we aim to respond quickly to such changes, they may temporarily disrupt production systems.

Connection Settings

Setting | Description
Key | The API key used to authenticate with the Azure OpenAI service.
Endpoint | The URL for the Azure OpenAI API endpoint.

Below is an example screenshot showing where to find these settings in the Azure portal.

[Screenshot: Azure OpenAI connection settings in the Azure portal]
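
For context, these connection settings map onto the authentication portion of an Azure OpenAI REST request: the Endpoint value supplies the base URL and the Key value is sent in the api-key header. The snippet below is a minimal sketch of such a request using Python and the requests library; the resource name, deployment name, prompt, and api-version are illustrative placeholders, not values from this document.

```python
import requests

# Placeholders standing in for the connection settings
ENDPOINT = "https://<resource-name>.openai.azure.com"  # "Endpoint" setting
API_KEY = "<azure-openai-api-key>"                     # "Key" setting

# The endpoint supplies the base URL; the key is sent in the api-key header.
response = requests.post(
    f"{ENDPOINT}/openai/deployments/<deployment-name>/chat/completions",
    params={"api-version": "2024-02-01"},              # illustrative API version
    headers={"api-key": API_KEY, "Content-Type": "application/json"},
    json={"messages": [{"role": "user", "content": "Hello"}]},
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```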

Input Settings

Setting | Description
Deployment | The deployment name or ID of the model to use. This is typically the deployment name assigned when the model was deployed in Azure OpenAI.
Instructions | (Optional) The prompt or instructions used to guide the model’s response.
Message | The message sent to the model. This is the input the model processes, guided by the instructions.
Response Format | The format in which the model should respond. When set to JSON, the connection attempts to parse the response as JSON; if the response is not valid JSON, the read fails.
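
Taken together, the input settings correspond to the URL and body of the same chat completions request: Deployment selects the model deployment in the URL, Instructions maps to the system message, Message to the user message, and a Response Format of JSON means the returned text must parse as valid JSON for the read to succeed. The sketch below illustrates that mapping under the same assumptions as the earlier snippet (Python with the requests library, placeholder resource values, and an illustrative api-version); it approximates the behavior described above rather than reproducing the Intelligence Hub's actual implementation.

```python
import json
import requests

ENDPOINT = "https://<resource-name>.openai.azure.com"  # connection "Endpoint"
API_KEY = "<azure-openai-api-key>"                     # connection "Key"

def run_read(deployment: str, message: str, instructions: str = "", response_format: str = "Text"):
    # "Instructions" becomes the system message; "Message" becomes the user message.
    messages = []
    if instructions:
        messages.append({"role": "system", "content": instructions})
    messages.append({"role": "user", "content": message})

    # "Deployment" selects the model deployment in the request URL.
    response = requests.post(
        f"{ENDPOINT}/openai/deployments/{deployment}/chat/completions",
        params={"api-version": "2024-02-01"},           # illustrative API version
        headers={"api-key": API_KEY, "Content-Type": "application/json"},
        json={"messages": messages},
        timeout=30,
    )
    response.raise_for_status()
    content = response.json()["choices"][0]["message"]["content"]

    # With a JSON response format, the reply must parse as JSON or the read fails,
    # mirroring the behavior described in the table above.
    if response_format == "JSON":
        return json.loads(content)  # raises json.JSONDecodeError on invalid JSON
    return content

# Example usage (placeholder deployment name and prompts):
# result = run_read("gpt-4o", "Summarize this shift report...",
#                   instructions="Respond in JSON.", response_format="JSON")
```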