openinference/python/instrumentation/openinference-instrumentation-promptflow/examples/chat_flow_example_to_arize.ipynb at main · Arize-ai/openinference

This integration lets you trace Microsoft Prompt flow flows and send the resulting traces to Arize.
pip install arize-otel promptflow
Import register and Endpoint from arize.otel, along with the other dependencies required for setup.
import os
from arize.otel import register, Endpoint
from opentelemetry.sdk.environment_variables import OTEL_EXPORTER_OTLP_ENDPOINT
from promptflow.tracing._start_trace import setup_exporter_from_environ
Set up OpenTelemetry with the register convenience function, passing your Arize credentials.
# Set up OTel via the convenience function
tracer_provider = register(
    space_id="your-space-id",  # found in the Space Settings page of the Arize app
    api_key="your-api-key",  # found in the Space Settings page of the Arize app
    project_name="your-project-name",  # choose any name you like
)
Then point the OpenTelemetry OTLP endpoint at Arize and call Prompt flow's setup_exporter_from_environ, so that any subsequent flows and LLM calls are traced.
os.environ[OTEL_EXPORTER_OTLP_ENDPOINT] = Endpoint.ARIZE
setup_exporter_from_environ()
Proceed with creating Prompt flow flows as usual.
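As a minimal sketch of what such a flow might look like, the snippet below decorates plain Python functions with Prompt flow's trace decorator so each call is recorded as a span and exported through the pipeline configured above. The function names (build_prompt, chat_flow) and their logic are illustrative assumptions, not part of the original notebook; a try/except fallback keeps the sketch runnable even where promptflow is not installed.

```python
# Illustrative sketch only: build_prompt and chat_flow are hypothetical names.
try:
    # Emits spans through the exporter configured by setup_exporter_from_environ
    from promptflow.tracing import trace
except ImportError:
    # Fallback so the sketch still runs without promptflow installed
    def trace(func):
        return func

@trace
def build_prompt(question: str) -> str:
    # Recorded as a child span of the calling flow when tracing is active
    return f"You are a helpful assistant. Answer concisely: {question}"

@trace
def chat_flow(question: str) -> str:
    # A real flow would send the prompt to an LLM; this sketch just returns it
    return build_prompt(question)

if __name__ == "__main__":
    print(chat_flow("What is OpenInference?"))
```

With the exporter environment variable set as above, running a flow like this sends its spans to your Arize project; without it, the decorated functions behave like ordinary Python calls.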