Haystack is a Python framework from deepset for building production-grade LLM pipelines — RAG, agents, search, and document processing. Arize AX captures every Haystack pipeline run — each component invocation, prompt construction, LLM call, and the data flowing between them — via the openinference-instrumentation-haystack package.
Documentation Index
Fetch the complete documentation index at: https://arize-ax.mintlify.dev/docs/llms.txt
Use this file to discover all available pages before exploring further.
Prerequisites
- Python 3.10+
- An Arize AX account (sign up)
- An OPENAI_API_KEY from the OpenAI Platform
Launch Arize AX
- Sign in to your Arize AX account.
- From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.
Install
Configure credentials
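Export the credentials the rest of this guide relies on. The values below are placeholders; substitute your own:

```shell
# Replace each placeholder with your own value.
export ARIZE_SPACE_ID="your-space-id"   # from Space Settings
export ARIZE_API_KEY="your-api-key"     # from Space Settings
export OPENAI_API_KEY="your-openai-key" # from the OpenAI Platform
```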
Setup tracing
Run Haystack
Expected output
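The exact reply varies by run and model; the script simply prints the generator's first reply while the spans are exported in the background:

```
$ python example.py
<model reply printed here; exact text varies by run and model>
```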
Verify in Arize AX
- Open your Arize AX space and select project haystack-tracing-example.
- You should see a new trace within ~30 seconds containing a Haystack Pipeline.run parent span wrapping PromptBuilder.run and OpenAIGenerator.run child spans, with the prompt, response, and token usage attached. (You may also see a one-time haystack.tracing.auto_enable initialization span on the first call.)
- If no traces appear, see Troubleshooting.
Troubleshooting
- No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
- Pipeline ran but no spans appear. HaystackInstrumentor().instrument(...) must run before any from haystack import .... Make sure instrumentation.py is the first import in your entry point.
- 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap in a model your key can call.
- Components used outside a Pipeline. The instrumentor primarily traces Haystack Pipeline.run. If you call individual components directly, you'll get LLM-call spans (when the underlying client is also instrumented) but not the pipeline-level trace.