POST /v2/ai-integrations

Create an AI integration
Example request:

curl --request POST \
  --url https://api.arize.com/v2/ai-integrations \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "name": "Production OpenAI",
  "provider": "openAI",
  "api_key": "sk-abc123...",
  "model_names": [
    "gpt-4",
    "gpt-4o"
  ],
  "enable_default_models": true,
  "scopings": [
    {
      "organization_id": "QWNjb3VudE9yZzoxMjM6YWJj",
      "space_id": null
    }
  ]
}
'

Example response:

{
  "id": "TGxtSW50ZWdyYXRpb246MTI6YUJjRA==",
  "name": "Production OpenAI",
  "provider": "openAI",
  "has_api_key": true,
  "model_names": [
    "gpt-4",
    "gpt-4o"
  ],
  "enable_default_models": true,
  "function_calling_enabled": true,
  "auth_type": "default",
  "scopings": [
    {}
  ],
  "created_at": "2026-02-13T21:27:19.055Z",
  "updated_at": "2026-02-13T21:27:19.279Z",
  "created_by_user_id": "VXNlcjoxOm5OYkM="
}
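The curl request above can also be built programmatically. A minimal Python sketch using only the standard library (the `build_create_request` helper is hypothetical, not an official client; the URL, headers, and fields are taken from this page):

```python
import json
import urllib.request

API_URL = "https://api.arize.com/v2/ai-integrations"

def build_create_request(token: str, payload: dict) -> urllib.request.Request:
    """Build (but do not send) the POST request for creating an AI integration."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Same body as the curl example above.
payload = {
    "name": "Production OpenAI",
    "provider": "openAI",
    "api_key": "sk-abc123...",
    "model_names": ["gpt-4", "gpt-4o"],
    "enable_default_models": True,
    "scopings": [
        {"organization_id": "QWNjb3VudE9yZzoxMjM6YWJj", "space_id": None},
    ],
}
req = build_create_request("<token>", payload)
# Send with urllib.request.urlopen(req) once a real token is supplied.
```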

Authorizations

Authorization
string
header
required

Most Arize AI endpoints require authentication. For those endpoints, include your API key in the request header using the format Authorization: Bearer <token>.

Body

application/json

Body containing AI integration creation parameters

name
string
required

Integration name

provider
enum<string>
required

The AI provider for this integration

Available options:
openAI,
azureOpenAI,
awsBedrock,
vertexAI,
anthropic,
custom,
nvidiaNim,
gemini
api_key
string

API key for the provider (write-only, never returned)

base_url
string

Custom base URL for the provider

model_names
string[]

Supported model names

headers
object

Custom headers to include in requests

enable_default_models
boolean

Enable provider's default model list (default false)

function_calling_enabled
boolean

Enable function/tool calling (default true)

auth_type
enum<string>

The authentication method for this integration

Available options:
default,
proxy_with_headers,
bearer_token
provider_metadata
object

Provider-specific configuration (AWS or GCP metadata)

scopings
object[]

Visibility scoping rules. Defaults to account-wide.

Response

An AI integration object

An AI integration configures access to an external LLM provider (e.g. OpenAI, Azure OpenAI, AWS Bedrock, Vertex AI). Integrations can be scoped to the entire account, a specific organization, or a specific space.
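The three scope levels described above (account, organization, space) can be read off a scoping entry. A sketch of that classification, assuming a scoping entry carries `organization_id` and `space_id` keys as in the request body (the helper itself is hypothetical):

```python
def scope_level(scoping: dict) -> str:
    """Classify a scoping entry: space-scoped, organization-scoped, or account-wide."""
    if scoping.get("space_id"):
        return "space"
    if scoping.get("organization_id"):
        return "organization"
    return "account"          # no IDs set: visible to the entire account
```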

id
string
required

The integration ID

name
string
required

The integration name

provider
enum<string>
required

The AI provider for this integration

Available options:
openAI,
azureOpenAI,
awsBedrock,
vertexAI,
anthropic,
custom,
nvidiaNim,
gemini
has_api_key
boolean
required

Whether an API key is configured (the key itself is never returned)

enable_default_models
boolean
required

Whether the provider's default model list is enabled

function_calling_enabled
boolean
required

Whether function/tool calling is enabled

auth_type
enum<string>
required

The authentication method for this integration

Available options:
default,
proxy_with_headers,
bearer_token
scopings
object[]
required

Visibility scoping rules

created_at
string<date-time>
required

When the integration was created

updated_at
string<date-time>
required

When the integration was last updated

created_by_user_id
string
required

The user ID of the user who created the integration

base_url
string | null

Custom base URL for the provider

model_names
string[] | null

Supported model names

headers
object

Custom headers included in requests

provider_metadata
object

Provider-specific configuration (AWS or GCP metadata)
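The `created_at` and `updated_at` fields are ISO 8601 / RFC 3339 timestamps with a trailing Z, which callers typically want as native datetimes. A minimal parsing sketch (hypothetical helper; field names and formats taken from the response schema above):

```python
import json
from datetime import datetime

def parse_integration(raw: str) -> dict:
    """Parse a create-integration response body and normalize its timestamps."""
    obj = json.loads(raw)
    for key in ("created_at", "updated_at"):
        # Replace the trailing 'Z' so fromisoformat accepts it on Python < 3.11.
        obj[key] = datetime.fromisoformat(obj[key].replace("Z", "+00:00"))
    return obj

# Abbreviated version of the example response on this page.
example = json.dumps({
    "id": "TGxtSW50ZWdyYXRpb246MTI6YUJjRA==",
    "has_api_key": True,
    "created_at": "2026-02-13T21:27:19.055Z",
    "updated_at": "2026-02-13T21:27:19.279Z",
})
```

Note that the response exposes only `has_api_key`; per the schema, the key itself is write-only and never returned.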