What Is OTLP Ingestion?

OpenTelemetry Protocol (OTLP) is a vendor-neutral, industry-standard telemetry format for transmitting trace data. Maxim provides OTLP ingestion capabilities for both AI and LLM Observability, enabling deep insights into your AI systems.

Before you begin

Ensure you have created a Log Repository in Maxim and have your Log Repository ID ready. You can find it in the Maxim Dashboard under Logs > Repositories.

Endpoint & Protocol Configuration

Endpoint: https://api.getmaxim.ai/v1/otel
Supported Protocols: HTTP with OTLP binary Protobuf or JSON

Protocol                    Content-Type
HTTP + Protobuf (binary)    application/x-protobuf or application/protobuf
HTTP + JSON                 application/json
Transport Security:
  • HTTPS/TLS is required.

Authentication Headers

Maxim’s OTLP endpoint requires the following headers:
  • x-maxim-repo-id: Your Maxim Log Repository ID
  • x-maxim-api-key: Your Maxim API Key
  • Content-Type: application/json, application/x-protobuf, or application/protobuf
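
As a quick sanity check, the required headers can be assembled as shown below. This is a minimal sketch using only the standard library; the credentials are placeholders and the payload is an empty OTLP/JSON resourceSpans document, so nothing is actually sent:

```python
import json

# Placeholder credentials -- substitute your real values
MAXIM_API_KEY = "your_api_key_here"
MAXIM_REPO_ID = "your_repository_id_here"

# Headers required by Maxim's OTLP endpoint (OTLP/JSON variant shown)
headers = {
    "x-maxim-api-key": MAXIM_API_KEY,
    "x-maxim-repo-id": MAXIM_REPO_ID,
    "Content-Type": "application/json",
}

# A minimal (empty) OTLP/JSON trace payload
payload = json.dumps({"resourceSpans": []})

# To actually send, POST over HTTPS, e.g.:
# import urllib.request
# req = urllib.request.Request(
#     "https://api.getmaxim.ai/v1/otel",
#     data=payload.encode(),
#     headers=headers,
# )
# urllib.request.urlopen(req)
```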

Supported Trace Format

Maxim currently supports traces that follow the OpenTelemetry Semantic Conventions for Generative AI (specification).
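
For reference, span attributes under these conventions live in the gen_ai.* namespace. A hypothetical LLM-call span might carry attributes like the following (the attribute names come from the GenAI semantic conventions; the values are purely illustrative):

```python
# Illustrative gen_ai.* attributes for a single LLM-call span.
# Names follow the OpenTelemetry GenAI semantic conventions; values are made up.
genai_attributes = {
    "gen_ai.operation.name": "chat",
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4o",
    "gen_ai.response.model": "gpt-4o-2024-08-06",
    "gen_ai.usage.input_tokens": 120,
    "gen_ai.usage.output_tokens": 45,
}
```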

Best Practices

  • Use binary Protobuf (application/x-protobuf) for optimal performance and robustness
  • Batch traces to reduce network overhead
  • Include rich attributes following GenAI semantic conventions
  • Secure your headers and avoid exposing credentials
  • Monitor attribute size limits and apply appropriate quotas

Error Codes and Responses

HTTP Status   Condition                                                          Response
200           Success                                                            { "data": { "success": true } }
403           Missing or invalid x-maxim-repo-id or x-maxim-api-key headers      { "code": 403, "message": "Invalid access error" }
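
A small helper for interpreting these responses in client code might look like this (interpret_otlp_response is a hypothetical name for illustration, not part of any Maxim SDK):

```python
def interpret_otlp_response(status: int, body: dict) -> str:
    """Map Maxim's OTLP ingestion responses (per the table above) to a summary.

    Illustrative helper only, not part of the Maxim SDK.
    """
    if status == 200 and body.get("data", {}).get("success"):
        return "ingested"
    if status == 403:
        # Missing or invalid x-maxim-repo-id / x-maxim-api-key headers
        return f"auth error: {body.get('message', 'Invalid access error')}"
    return f"unexpected status {status}"
```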

Code Examples

Python Example

from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Replace with your actual Maxim API key and Log Repository ID
maxim_api_key = "your_api_key_here"
repo_id = "your_repository_id_here"

tracer_provider = trace_sdk.TracerProvider()
span_exporter = OTLPSpanExporter(
    endpoint="https://api.getmaxim.ai/v1/otel",
    headers={
        "x-maxim-api-key": maxim_api_key,
        "x-maxim-repo-id": repo_id,
    },
)

# Register the exporter; SimpleSpanProcessor exports each span as it ends
tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter))
# Make it the global provider
trace_api.set_tracer_provider(tracer_provider)

Maxim-specific additional attributes

Maxim supports the following additional attributes to enrich your traces:
  • maxim-trace-tags: A Map<string, string> of tags that will be attached to the parent trace of the span (if the current node is a trace itself, this will be attached to the same trace).
  • maxim-tags: A Map<string, string> of tags that will be attached to the current node.
  • maxim-trace-metrics: A Map<string, number> of metrics that will be attached to the parent trace of the span (if the current node is a trace itself, this will be attached to the same trace).
  • maxim-metrics: A Map<string, number> of metrics that will be attached to the current node.
For OpenInference, the above tags should be present in the metadata field of the OpenInference log line. Example shown below:
from openinference.instrumentation import using_attributes

with using_attributes(
    session_id="my-test-session",
    user_id="my-test-user",
    metadata={
        "test-int": 1,
        "test-str": "string",
        "test-list": [1, 2, 3],
        "test-dict": {
            "key-1": "val-1",
            "key-2": "val-2",
        },
        "maxim-trace-tags": {
            "tag-1": "val-1",
            "tag-2": "val-2",
        },
        "maxim-tags": {
            "tag-3": "val-3",
        },
        "maxim-metrics": {
            "metric-1": 1,
            "metric-2": 2,
        },
        "maxim-trace-metrics": {
            "trace-metric-1": 1,
            "trace-metric-2": 2,
        },
    },
    tags=["tag-1", "tag-2"],
    prompt_template="Who won the soccer match in {city} on {date}",
    prompt_template_version="v1.0",
    prompt_template_variables={
        "city": "Johannesburg",
        "date": "July 11th",
    },
):
    ...  # calls made inside this context are traced with the metadata above
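
If you are emitting raw OpenTelemetry spans rather than using OpenInference, one possible approach is to JSON-serialize these maps into string-valued span attributes, since OTLP attribute values only support primitives and arrays of primitives. This encoding is an assumption for illustration, not documented Maxim behavior; verify against the traces ingested into your repository:

```python
import json

# Hypothetical sketch: JSON-serialize Maxim's map-valued attributes so they
# fit OTLP's primitive attribute values. Whether Maxim decodes strings this
# way is an assumption here, not confirmed behavior.
maxim_attrs = {
    "maxim-trace-tags": json.dumps({"tag-1": "val-1", "tag-2": "val-2"}),
    "maxim-tags": json.dumps({"tag-3": "val-3"}),
    "maxim-trace-metrics": json.dumps({"trace-metric-1": 1}),
    "maxim-metrics": json.dumps({"metric-1": 1, "metric-2": 2}),
}

# These would then be set on a span, e.g.:
# span.set_attributes(maxim_attrs)
```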