PATCH /v2/ai-integrations/{integration_id}
Update an AI integration
Example request:

curl --request PATCH \
  --url https://api.arize.com/v2/ai-integrations/{integration_id} \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "name": "Updated OpenAI Integration",
  "api_key": null,
  "model_names": [
    "gpt-4o",
    "gpt-4o-mini"
  ]
}
'
Example response:

{
  "id": "TGxtSW50ZWdyYXRpb246MTI6YUJjRA==",
  "name": "My OpenAI Integration",
  "provider": "openAI",
  "has_api_key": true,
  "model_names": [
    "gpt-4",
    "gpt-4o"
  ],
  "enable_default_models": true,
  "function_calling_enabled": true,
  "auth_type": "default",
  "scopings": [
    {}
  ],
  "created_at": "2026-02-13T21:27:19.055Z",
  "updated_at": "2026-02-13T21:27:19.279Z",
  "created_by_user_id": "VXNlcjoxOm5OYkM="
}

Authorizations

Authorization
string
header
required

Most Arize AI endpoints require authentication. For those endpoints, include your API key in the request header using the format Authorization: Bearer <token>.

Path Parameters

integration_id
string
required

The unique identifier of the AI integration.

Example:

"RW50aXR5OjEyMzQ1"

Body

application/json

Body containing AI integration update parameters. At least one field must be provided.
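The curl example above can also be expressed with only the Python standard library. This is an illustrative sketch, not an official client: the token and integration ID are placeholders, and the endpoint URL is taken from the curl example on this page.

```python
import json
import urllib.request

TOKEN = "YOUR_API_KEY"               # placeholder, not a real key
INTEGRATION_ID = "RW50aXR5OjEyMzQ1"  # example ID from this page

# Only the fields being changed go in the body; at least one is required.
body = json.dumps({"name": "Updated OpenAI Integration"}).encode()

req = urllib.request.Request(
    f"https://api.arize.com/v2/ai-integrations/{INTEGRATION_ID}",
    data=body,
    method="PATCH",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)
# integration = json.load(urllib.request.urlopen(req))  # sends the request
```

The final line is left commented out so the sketch can be read without a valid token; uncommenting it sends the PATCH and parses the returned AI integration object.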

name
string

New integration name

provider
enum<string>

The AI provider for this integration

Available options:
openAI,
azureOpenAI,
awsBedrock,
vertexAI,
anthropic,
custom,
nvidiaNim,
gemini
api_key
string | null

New API key. Pass null to remove the existing key. Omit to keep unchanged.
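The difference between sending null and omitting the field matters when you serialize the body: a field set to None becomes JSON null (clearing the stored key), while a field left out entirely is unchanged. A small sketch, assuming a hypothetical helper that includes only the fields you pass:

```python
import json

def build_update_payload(**changes):
    # Fields explicitly passed are serialized (None -> JSON null,
    # which clears the value); omitted fields are not sent at all,
    # so the server leaves them unchanged.
    return json.dumps(changes)

# Rename the integration and remove its stored API key:
payload = build_update_payload(name="Renamed Integration", api_key=None)

# Rename only -- api_key is omitted, so the existing key is kept:
rename_only = build_update_payload(name="Renamed Integration")
```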

base_url
string | null

Custom base URL. Pass null to remove.

model_names
string[]

Supported model names (replaces all)
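Because model_names replaces the entire list, appending a model means sending the existing names plus the new one, not just the addition. A sketch (the current list here is the example from this page's response; in practice you would read it from a prior GET):

```python
# model_names replaces the full list on update, so to add a model
# you must resend every name you want to keep.
current = ["gpt-4", "gpt-4o"]        # e.g. from a prior GET of the integration
updated = current + ["gpt-4o-mini"]  # keep existing names, append the new one

payload = {"model_names": updated}
```

Sending {"model_names": ["gpt-4o-mini"]} alone would drop gpt-4 and gpt-4o from the integration.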

headers
object

Custom headers. Pass null to remove.

enable_default_models
boolean

Enable provider's default model list

function_calling_enabled
boolean

Enable function/tool calling

auth_type
enum<string>

The authentication method for this integration

Available options:
default,
proxy_with_headers,
bearer_token
provider_metadata
object

Provider-specific configuration

scopings
object[]

Visibility scoping rules (replaces all existing scopings)

Response

An AI integration object

An AI integration configures access to an external LLM provider (e.g. OpenAI, Azure OpenAI, AWS Bedrock, Vertex AI). Integrations can be scoped to the entire account, a specific organization, or a specific space.

id
string
required

The integration ID

name
string
required

The integration name

provider
enum<string>
required

The AI provider for this integration

Available options:
openAI,
azureOpenAI,
awsBedrock,
vertexAI,
anthropic,
custom,
nvidiaNim,
gemini
has_api_key
boolean
required

Whether an API key is configured (the key itself is never returned)

enable_default_models
boolean
required

Whether the provider's default model list is enabled

function_calling_enabled
boolean
required

Whether function/tool calling is enabled

auth_type
enum<string>
required

The authentication method for this integration

Available options:
default,
proxy_with_headers,
bearer_token
scopings
object[]
required

Visibility scoping rules

created_at
string<date-time>
required

When the integration was created

updated_at
string<date-time>
required

When the integration was last updated

created_by_user_id
string
required

The ID of the user who created the integration

base_url
string | null

Custom base URL for the provider

model_names
string[] | null

Supported model names

headers
object

Custom headers included in requests

provider_metadata
object

Provider-specific configuration (AWS or GCP metadata)