The Stream Client is used for real-time logging of model predictions, providing lower latency than batch logging.
log()
The log() method moves from client.log() in v7 to client.ml.log_stream() in v8.
Parameter Reference
This table provides a complete mapping of all parameters between v7 and v8, including which parameters were removed, renamed, or remain unchanged.

| Parameter | v7 | v8 | Changes |
|---|---|---|---|
| space_id | Client init | Required per call | Must pass explicitly |
| model_id | Required | Required | Renamed to model_name |
| model_type | Required | Required | — |
| environment | Required | Required | — |
| model_version | Optional | Optional | — |
| prediction_id | Optional | Optional | — |
| prediction_timestamp | Optional | Optional | — |
| prediction_label | Optional | Optional | — |
| actual_label | Optional | Optional | — |
| features | Optional | Optional | — |
| embedding_features | Optional | Optional | — |
| shap_values | Optional | Optional | — |
| tags | Optional | Optional | — |
| batch_id | Optional | Optional | — |
| prompt | Optional | Optional | — |
| response | Optional | Optional | — |
| prompt_template | Optional | Optional | — |
| prompt_template_version | Optional | Optional | — |
| llm_model_name | Optional | Optional | — |
| llm_params | Optional | Optional | — |
| llm_run_metadata | Optional | Optional | — |
| timeout | N/A | Optional | New parameter for request timeout |
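The renames in the table above can be sketched as a small keyword-mapping helper. This is a hypothetical illustration only, not an SDK function; the argument values and the plain-string stand-ins for model_type and environment are assumptions:

```python
def migrate_log_kwargs(space_id, **v7_kwargs):
    """Hypothetical helper (not part of the Arize SDK): translate v7
    client.log() keyword arguments into the v8 client.ml.log_stream()
    shape summarized in the table above."""
    v8_kwargs = dict(v7_kwargs)
    # v8 renames model_id to model_name.
    if "model_id" in v8_kwargs:
        v8_kwargs["model_name"] = v8_kwargs.pop("model_id")
    # space_id moved from client initialization to a required per-call argument.
    v8_kwargs["space_id"] = space_id
    return v8_kwargs

# Illustrative values; all other parameters pass through unchanged.
kwargs = migrate_log_kwargs(
    "my-space",
    model_id="fraud-model",
    model_type="SCORE_CATEGORICAL",
    environment="PRODUCTION",
)
# client.ml.log_stream(**kwargs, timeout=10)  # timeout is new and optional in v8
```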