openinference-instrumentation-annotation adds annotation-driven tracing to any Java application. Annotate your methods with @Chain, @LLM, @Tool, @Agent, or @Span; a ByteBuddy agent instruments each annotated class as it loads and wraps every call in an OpenInference span backed by OpenTelemetry. Use it when you’ve built an agent or pipeline by hand and want OpenInference semantics without a framework-specific instrumentor.
plugins {
    id 'application'
}

repositories {
    mavenCentral()
}

dependencies {
    // OpenInference annotation instrumentor + semantic conventions
    implementation 'com.arize:openinference-instrumentation-annotation:0.1.2'
    implementation 'com.arize:openinference-semantic-conventions:0.1.12'

    // OpenTelemetry SDK + OTLP exporter
    implementation 'io.opentelemetry:opentelemetry-sdk:1.50.0'
    implementation 'io.opentelemetry:opentelemetry-exporter-otlp:1.50.0'
}

// Required: emits parameter names into the class file so the
// annotation agent can attach them as span attributes. Without this,
// parameters appear as `arg0`, `arg1`, ... in your spans.
compileJava {
    options.compilerArgs += '-parameters'
}

application {
    mainClass = 'example.Main'
}
The annotation artifact uses standard hyphenated naming — openinference-instrumentation-annotation — unlike openinference-instrumentation-springAI, which retains camelCase on Maven Central.
The annotation instrumentor needs three things at startup: the ByteBuddy agent installed before any annotated class is loaded, an OITracer wrapping an OpenTelemetry tracer, and that OITracer registered with the OpenInferenceAgent so the intercepted methods know where to emit spans.
// src/main/java/example/Main.java
package example;

import com.arize.instrumentation.OITracer;
import com.arize.instrumentation.OpenInferenceAgent;
import com.arize.instrumentation.annotation.OpenInferenceAgentInstaller;
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.api.trace.propagation.W3CTraceContextPropagator;
import io.opentelemetry.context.propagation.ContextPropagators;
import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter;
import io.opentelemetry.sdk.OpenTelemetrySdk;
import io.opentelemetry.sdk.resources.Resource;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;
import java.time.Duration;
import java.util.Map;
import java.util.concurrent.TimeUnit;

public class Main {

    public static void main(String[] args) throws Exception {
        // CRITICAL: install the ByteBuddy agent BEFORE the JVM loads any
        // annotated class. We trigger QAService loading further down inside
        // main(), so installing here (as the first statement) is in time.
        OpenInferenceAgentInstaller.install();

        String apiKey = System.getenv("ARIZE_API_KEY");
        String spaceId = System.getenv("ARIZE_SPACE_ID");
        String project = System.getenv().getOrDefault(
                "ARIZE_PROJECT_NAME", "annotation-tracing-example");

        // Resource: service name + Arize project name (the latter is what
        // makes the trace appear under the right project in Arize).
        Resource resource = Resource.getDefault().merge(Resource.create(
                Attributes.of(
                        AttributeKey.stringKey("service.name"), "annotation",
                        AttributeKey.stringKey("openinference.project.name"), project)));

        // OTLP gRPC exporter pointed at Arize.
        OtlpGrpcSpanExporter exporter = OtlpGrpcSpanExporter.builder()
                .setEndpoint("https://otlp.arize.com:443")
                .setHeaders(() -> Map.of(
                        "authorization", apiKey,
                        "arize-space-id", spaceId,
                        "arize-interface", "java"))
                .build();

        SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
                .addSpanProcessor(BatchSpanProcessor.builder(exporter)
                        .setScheduleDelay(Duration.ofSeconds(1))
                        .build())
                .setResource(resource)
                .build();

        OpenTelemetrySdk.builder()
                .setTracerProvider(tracerProvider)
                .setPropagators(ContextPropagators.create(
                        W3CTraceContextPropagator.getInstance()))
                .buildAndRegisterGlobal();

        // The OITracer wraps the SDK tracer with OpenInference semantic
        // attribute handling. Register it on the global agent so every
        // annotated method routes its span through it.
        OITracer tracer = new OITracer(
                tracerProvider.get("com.arize.annotation"));
        OpenInferenceAgent.register(tracer);
        System.out.println("Arize AX tracing initialized for Annotations.");

        // First reference to QAService — class loading happens here,
        // safely AFTER OpenInferenceAgentInstaller.install() above.
        QAService service = new QAService();
        String answer = service.answer(
                "Why is the ocean salty? Answer in two sentences.");
        System.out.println(answer);

        // Force flush + shutdown — without this, the JVM may exit before
        // the BatchSpanProcessor delivers its queue and spans get dropped.
        tracerProvider.forceFlush().join(10, TimeUnit.SECONDS);
        tracerProvider.shutdown().join(10, TimeUnit.SECONDS);
        OpenInferenceAgent.unregister();
    }
}
Parameters are automatically captured as input.value and the return value as output.value. Use @ExcludeFromSpan on a parameter to drop it from the input attribute, and @SpanMapping to map a parameter or field to a specific OpenInference semantic-convention attribute.
Annotate your service methods with the span kind that fits each step, then run it. The agent intercepts at class load — there’s nothing further to wire up beyond the annotations themselves.
// src/main/java/example/QAService.java
package example;

import com.arize.instrumentation.annotation.Agent;
import com.arize.instrumentation.annotation.Chain;
import com.arize.instrumentation.annotation.ExcludeFromSpan;
import com.arize.instrumentation.annotation.LLM;
import com.arize.instrumentation.annotation.Tool;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class QAService {

    private static final HttpClient HTTP = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(10))
            .build();

    @Agent(name = "qa-agent")
    public String answer(String question) {
        String context = retrieve(question);
        Map<String, Object> weather = getWeather("San Francisco");
        return generate(question, context, weather);
    }

    @Chain(name = "retriever")
    public String retrieve(String query) {
        return "OpenInference is an open standard for AI tracing.";
    }

    @Tool(name = "weather", description = "Gets current weather for a location")
    public Map<String, Object> getWeather(String location) {
        return Map.of("temp", 68, "condition", "foggy", "location", location);
    }

    @LLM(name = "generator")
    public String generate(String question, String context,
                           @ExcludeFromSpan Map<String, Object> weather) {
        // @ExcludeFromSpan keeps the weather parameter out of the
        // captured input so the LLM span doesn't pick up unrelated
        // metadata that the language model never sees.
        String prompt = "Use this context to answer the question.\n"
                + "Context: " + context + "\n\n"
                + "Question: " + question;
        return openAiChatCompletion(prompt);
    }

    // Plain HTTPS call to OpenAI — no SDK with its own observability
    // hooks, so the only spans Arize sees are the annotation-driven
    // ones from @Agent / @Chain / @Tool / @LLM above.
    private String openAiChatCompletion(String prompt) {
        try {
            String body = "{\"model\":\"gpt-5\","
                    + "\"messages\":[{\"role\":\"user\",\"content\":\""
                    + jsonEscape(prompt) + "\"}]}";
            HttpRequest req = HttpRequest.newBuilder()
                    .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                    .timeout(Duration.ofMinutes(3))
                    .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> resp = HTTP.send(req, HttpResponse.BodyHandlers.ofString());
            if (resp.statusCode() != 200) {
                throw new RuntimeException(
                        "OpenAI HTTP " + resp.statusCode() + ": " + resp.body());
            }
            return extractContent(resp.body());
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Minimal extractor — pulls "content":"..." out of the response
    // body. Replace with Jackson / Gson in production code.
    private static String extractContent(String json) {
        Matcher m = Pattern.compile(
                "\"content\"\\s*:\\s*\"((?:\\\\.|[^\"\\\\])*)\"")
                .matcher(json);
        if (!m.find()) {
            throw new RuntimeException("no content in: " + json);
        }
        return m.group(1)
                .replace("\\n", "\n")
                .replace("\\\"", "\"")
                .replace("\\\\", "\\");
    }

    private static String jsonEscape(String s) {
        return s.replace("\\", "\\\\")
                .replace("\"", "\\\"")
                .replace("\n", "\\n");
    }
}
Calling service.answer("Why is the ocean salty? Answer in two sentences.") produces a nested trace of four spans:

qa-agent (AGENT)
  ├─ retriever (CHAIN)
  ├─ weather (TOOL)
  └─ generator (LLM)

and the program prints:
Arize AX tracing initialized for Annotations.
The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.
Open your Arize AX space and select project annotation-tracing-example.
You should see a new trace within ~30–60 seconds (Arize’s Java OTLP ingest is slightly slower than the Python path) with the four-span hierarchy shown above. Click the root qa-agent span to inspect captured inputs and outputs.
No spans, but OpenInferenceAgentInstaller.install() ran. The ByteBuddy agent only rewrites classes loaded after it installs. Make sure install() is the first statement in main, and that you don’t reference any annotated class (directly or transitively via imports) before that line. The example above defers QAService loading until after install() by only naming QAService inside main.
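For instance, the following pattern silently defeats the agent. This is a sketch of the anti-pattern, not part of the example above:

package example;

import com.arize.instrumentation.annotation.OpenInferenceAgentInstaller;

public class Main {
    // ANTI-PATTERN: the static field initializer runs when the JVM
    // initializes Main, so QAService is loaded before main() ever reaches
    // install() and its annotations are never intercepted.
    private static final QAService SERVICE = new QAService();

    public static void main(String[] args) {
        OpenInferenceAgentInstaller.install(); // too late for SERVICE
        System.out.println(SERVICE.answer("Why is the ocean salty?")); // no spans produced
    }
}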
Parameters appear as arg0, arg1, … in span attributes. The compiler stripped parameter names. Add compileJava { options.compilerArgs += '-parameters' } to your build.gradle (already in the Install snippet), or fall back to @SpanMapping(parameter = "arg0", ...) to reference the generated names.
No traces in Arize. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs gradle run. To confirm spans are being produced locally before troubleshooting export, add SimpleSpanProcessor.create(LoggingSpanExporter.create()) as an extra processor — it prints every span to stderr.
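A minimal sketch of that local-debug variant, applied to the SdkTracerProvider construction in Main.java (it assumes you also add io.opentelemetry:opentelemetry-exporter-logging to the dependencies block):

// build.gradle addition, version aligned with the other OpenTelemetry artifacts:
//   implementation 'io.opentelemetry:opentelemetry-exporter-logging:1.50.0'
// Extra imports:
//   io.opentelemetry.exporter.logging.LoggingSpanExporter
//   io.opentelemetry.sdk.trace.export.SimpleSpanProcessor

SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
        // Existing OTLP/Arize path from Main.java.
        .addSpanProcessor(BatchSpanProcessor.builder(exporter)
                .setScheduleDelay(Duration.ofSeconds(1))
                .build())
        // Extra processor: synchronously logs every finished span so you can
        // confirm spans are produced before debugging the export path.
        .addSpanProcessor(SimpleSpanProcessor.create(LoggingSpanExporter.create()))
        .setResource(resource)
        .build();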
401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap "gpt-5" in the openAiChatCompletion body for a model your key can call.
Spans dropped at JVM exit. BatchSpanProcessor exports asynchronously. Always call tracerProvider.forceFlush().join(...) and tracerProvider.shutdown().join(...) before main returns.
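If your application has several exit paths, one option is to register the flush as a JVM shutdown hook right after the provider is built; a minimal sketch using the variable names from Main.java:

// Runs on normal JVM shutdown (including System.exit), flushing any spans
// still queued in the BatchSpanProcessor before the process dies.
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    tracerProvider.forceFlush().join(10, TimeUnit.SECONDS);
    tracerProvider.shutdown().join(10, TimeUnit.SECONDS);
}));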
Lost spans across thread boundaries. The ByteBuddy agent wraps each annotated method on the calling thread. OpenTelemetry context does not automatically follow execution across CompletableFuture, ExecutorService, reactive frameworks (Reactor, RxJava, Mutiny), or coroutines. Propagate context explicitly with io.opentelemetry.context.Context.current().wrap(...) when handing work to another thread, or fall back to the programmatic span API where you control span lifetimes directly.
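A minimal sketch of explicit propagation when handing a retrieval call to an ExecutorService; the BackgroundRetrieval class and the pool are illustrative, not part of the example above:

package example;

import io.opentelemetry.context.Context;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundRetrieval {

    private final QAService service = new QAService();
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    public Future<String> retrieveInBackground(String query) {
        // Capture the caller's context (it carries the active annotated span)
        // and bind it to the task, so spans created on the worker thread join
        // the same trace instead of starting a new root.
        Callable<String> task = () -> service.retrieve(query);
        return pool.submit(Context.current().wrap(task));
        // Alternative: ExecutorService traced = Context.taskWrapping(pool);
        // tasks submitted to `traced` run with the context current at
        // submission time.
    }
}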
Hiding sensitive fields. Pass a TraceConfig when constructing the OITracer to suppress inputs / outputs / tool parameters: new OITracer(provider.get("..."), TraceConfig.builder().hideInputs(true).hideOutputs(true).build()).
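Dropped into Main.java, that replaces the OITracer construction. A sketch that assumes TraceConfig sits next to OITracer in com.arize.instrumentation; adjust the import to match your version:

// Assumed import (location not confirmed on this page):
//   com.arize.instrumentation.TraceConfig

OITracer tracer = new OITracer(
        tracerProvider.get("com.arize.annotation"),
        TraceConfig.builder()
                .hideInputs(true)    // suppress captured input.value on every span
                .hideOutputs(true)   // suppress captured output.value on every span
                .build());
OpenInferenceAgent.register(tracer);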