Manage prompt templates and their versions programmatically. Create versioned prompts, assign labels to specific versions, and retrieve prompts by label for use in production workflows.

Key Capabilities

  • Create and manage prompt templates within spaces
  • Version prompts with commit messages
  • Assign and resolve labels (e.g. "production", "staging") to specific versions
  • List prompts and versions with pagination support
  • Retrieve, update, and delete prompts

List Prompts

Prompts operations are currently in alpha; a one-time warning is emitted on first use.

List all prompts you have access to, with optional filtering by space.

resp = client.prompts.list(
    space="your-space-name-or-id",  # optional
    name="customer-support",        # optional substring filter
    limit=50,
)

for prompt in resp.prompts:
    print(prompt.id, prompt.name)
For details on pagination, field introspection, and data conversion (to dict/JSON/DataFrame), see Response Objects.
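Cursor details are covered in Response Objects, but the general shape of cursor-based pagination is worth sketching. The `fetch_page` stub below simulates a paginated endpoint; the real client's cursor parameter and response field names may differ, so treat the names here as illustrative.

```python
# Generic cursor pagination: keep requesting pages until the server
# stops returning a next cursor. PAGES and fetch_page simulate a
# paginated endpoint; field names are illustrative only.
PAGES = {
    None: (["prompt-a", "prompt-b"], "cursor-1"),
    "cursor-1": (["prompt-c"], None),
}

def fetch_page(cursor=None):
    items, next_cursor = PAGES[cursor]
    return {"prompts": items, "next_cursor": next_cursor}

def iter_all():
    """Yield every item across all pages."""
    cursor = None
    while True:
        page = fetch_page(cursor)
        yield from page["prompts"]
        cursor = page["next_cursor"]
        if cursor is None:
            break

print(list(iter_all()))
# → ['prompt-a', 'prompt-b', 'prompt-c']
```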

Create a Prompt

Create a new prompt with an initial version. Prompt names must be unique within the target space.
from arize._generated.api_client.models.input_variable_format import InputVariableFormat
from arize._generated.api_client.models.llm_provider import LlmProvider
from arize._generated.api_client.models.llm_message import LLMMessage

prompt = client.prompts.create(
    space="your-space-name-or-id",
    name="customer-support-agent",
    description="Handles tier-1 customer support queries",
    commit_message="Initial version",
    input_variable_format=InputVariableFormat.F_STRING,
    provider=LlmProvider.OPENAI,
    model="gpt-4o",
    messages=[
        LLMMessage(role="system", content="You are a helpful customer support agent for {company_name}."),
        LLMMessage(role="user", content="{user_query}"),
    ],
)

print(prompt.id, prompt.name)

With Invocation Parameters

Set default inference parameters alongside the prompt template.
from arize._generated.api_client.models.invocation_params import InvocationParams

prompt = client.prompts.create(
    space="your-space-name-or-id",
    name="summarizer",
    commit_message="Initial version",
    input_variable_format=InputVariableFormat.F_STRING,
    provider=LlmProvider.OPENAI,
    model="gpt-4o-mini",
    messages=[
        LLMMessage(role="user", content="Summarize the following text: {text}"),
    ],
    invocation_params=InvocationParams(temperature=0.2, max_tokens=512),
)

Get a Prompt

Retrieve a prompt by name or ID. By default, the latest version is returned. When looking up by name, also pass space to disambiguate.
prompt = client.prompts.get(
    prompt="your-prompt-name-or-id",
    space="your-space-name-or-id",  # required when using a name
)

print(prompt.id, prompt.name)
print(prompt.version)
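Since the prompts above use InputVariableFormat.F_STRING, filling a retrieved template is plain Python str.format. A minimal sketch with stand-in message dicts — the client's actual message objects and attribute names may differ:

```python
# Render f-string-style prompt messages with concrete variable values.
# The message dicts below stand in for whatever structure your retrieved
# prompt exposes; the rendering itself is plain str.format.

def render_messages(messages, **variables):
    """Substitute {placeholder} variables into each message's content."""
    return [
        {"role": m["role"], "content": m["content"].format(**variables)}
        for m in messages
    ]

template = [
    {"role": "system",
     "content": "You are a helpful customer support agent for {company_name}."},
    {"role": "user", "content": "{user_query}"},
]

rendered = render_messages(
    template, company_name="Acme", user_query="Where is my order?"
)
print(rendered[0]["content"])
# → You are a helpful customer support agent for Acme.
```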

Get a Specific Version

prompt = client.prompts.get(
    prompt="your-prompt-name-or-id",
    version_id="specific-version-id",
)

Get by Label

Resolve a named label (e.g. "production") to the version it currently points to.
prompt = client.prompts.get(
    prompt="your-prompt-name-or-id",
    label="production",
)

Update a Prompt

Update a prompt’s metadata. Currently supports updating the description.
prompt = client.prompts.update(
    prompt="your-prompt-name-or-id",
    description="Updated description for this prompt",
)

print(prompt)

Delete a Prompt

Delete a prompt by name or ID. This operation is irreversible and removes all associated versions. There is no response from this call.
client.prompts.delete(prompt="your-prompt-name-or-id")

print("Prompt deleted successfully")

Manage Versions

List Versions

List all versions for a prompt in reverse-chronological order.
resp = client.prompts.list_versions(
    prompt="your-prompt-name-or-id",
    limit=50,
)

for version in resp.prompt_versions:
    print(version.id, version.commit_message)

Create a New Version

Add a new version to an existing prompt. Each version is immutable after creation.
version = client.prompts.create_version(
    prompt="your-prompt-name-or-id",
    commit_message="Improved system prompt for edge cases",
    input_variable_format=InputVariableFormat.F_STRING,
    provider=LlmProvider.OPENAI,
    model="gpt-4o",
    messages=[
        LLMMessage(role="system", content="You are an expert customer support agent for {company_name}. Be concise."),
        LLMMessage(role="user", content="{user_query}"),
    ],
)

print(version.id)

Manage Labels

Labels are mutable pointers to a specific version. Use them to decouple your application code from version IDs: to promote a new version, move the label instead of editing code.
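The pointer semantics can be sketched with a plain dict: application code resolves a label name and never hard-codes a version ID, so promotion is just rebinding the pointer. This is a conceptual model, not the client's storage.

```python
# Toy model of labels: a mutable mapping from label name to version ID.
labels = {"production": "v1", "staging": "v2"}

def resolve(label):
    """Application code resolves a label, never a hard-coded version ID."""
    return labels[label]

assert resolve("production") == "v1"
labels["production"] = "v2"   # promote: rebind the pointer, code unchanged
assert resolve("production") == "v2"
```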

Get a Label

Resolve a label name to the version it currently points to.
version = client.prompts.get_label(
    prompt="your-prompt-name-or-id",
    label_name="production",
)

print(version.id, version.commit_message)

Set Labels on a Version

Assign one or more labels to a version. This replaces all existing labels on that version.
resp = client.prompts.set_labels(
    version_id="your-version-id",
    labels=["production"],
)

Promote a New Version

Create a new version, then point the label at it.
# Create the new version
new_version = client.prompts.create_version(
    prompt="your-prompt-name-or-id",
    commit_message="Tuned temperature and tone",
    input_variable_format=InputVariableFormat.F_STRING,
    provider=LlmProvider.OPENAI,
    model="gpt-4o",
    messages=[...],
)

# Promote it to production
client.prompts.set_labels(
    version_id=new_version.id,
    labels=["production"],
)

Delete a Label

Remove a label from a version. This does not delete the version itself.
client.prompts.delete_label(
    version_id="your-version-id",
    label_name="staging",
)