Prerequisites
- Node.js 18+
- @ai-sdk/openai
- @ai-sdk/anthropic
- @ai-sdk/google
- @maximai/maxim-js
- ai (Vercel AI SDK)
1. Set Up Environment Variables
Add the following to your `.env` file:
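The original snippet is not shown here; a minimal `.env` sketch, assuming a Maxim API key and log repository ID plus keys for the provider packages listed in the prerequisites (the Maxim variable names are assumptions; check your Maxim dashboard):

```bash
# Maxim observability credentials (variable names are assumptions)
MAXIM_API_KEY=your_maxim_api_key
MAXIM_LOG_REPO_ID=your_log_repository_id

# Provider keys read by the respective @ai-sdk packages
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
GOOGLE_GENERATIVE_AI_API_KEY=your_google_api_key
```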
2. Initialize Maxim Logger
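A minimal initialization sketch, assuming the `Maxim` client and `logger()` factory exposed by `@maximai/maxim-js` (option shapes may differ slightly across SDK versions):

```typescript
import { Maxim } from "@maximai/maxim-js";

// Create the Maxim client with your API key from .env.
const maxim = new Maxim({ apiKey: process.env.MAXIM_API_KEY! });

// Obtain a logger bound to a specific log repository.
// All traced LLM activity will appear under this repository in the dashboard.
export const logger = await maxim.logger({
  id: process.env.MAXIM_LOG_REPO_ID!,
});
```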
3. Wrap AI SDK Models with Maxim
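The wrapping step can be sketched as follows, assuming the `wrapMaximAISDKModel` helper from the Maxim Vercel AI SDK integration and the logger created in the previous step (the `./logger` import path is hypothetical):

```typescript
import { openai } from "@ai-sdk/openai";
import { wrapMaximAISDKModel } from "@maximai/maxim-js/vercel-ai-sdk";
import { logger } from "./logger"; // hypothetical module exporting the Maxim logger

// The wrapped model is a drop-in replacement for the original:
// every call through it is automatically traced to Maxim.
const model = wrapMaximAISDKModel(openai("gpt-4o-mini"), logger);

export { model };
```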
4. Make LLM Calls Using Wrapped Models
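Once wrapped, the model is used exactly like an unwrapped AI SDK model; a sketch using `generateText` (the `./models` import path is hypothetical):

```typescript
import { generateText } from "ai";
import { model } from "./models"; // hypothetical module exporting the wrapped model

// A standard AI SDK call; tracing happens transparently inside the wrapper.
const { text } = await generateText({
  model,
  prompt: "Explain LLM observability in one sentence.",
});

console.log(text);
```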
5. Use with All Vercel AI SDK Functions
Generate Object
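A `generateObject` sketch with a Zod schema, assuming the same wrapped model (the `./models` path is hypothetical):

```typescript
import { generateObject } from "ai";
import { z } from "zod";
import { model } from "./models"; // hypothetical module exporting the wrapped model

// Structured output: the wrapper traces this call the same way as generateText.
const { object } = await generateObject({
  model,
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: "Suggest a title and tags for a post about tracing LLM calls.",
});

console.log(object.title, object.tags);
```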
Stream Text
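A `streamText` sketch (AI SDK v4 style, where `streamText` returns a result synchronously; the `./models` path is hypothetical):

```typescript
import { streamText } from "ai";
import { model } from "./models"; // hypothetical module exporting the wrapped model

const result = streamText({
  model,
  prompt: "Write a haiku about tracing.",
});

// Consume the token stream; each chunk is also captured in the Maxim trace.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```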
6. Add Custom Metadata and Tracing
You can add custom names, tags, and IDs to sessions, traces, spans, and generations.
Available Metadata Fields
- Entity Naming: `sessionName`, `traceName`, `spanName`, `generationName`
- Entity Tagging: `sessionTags`, `traceTags`, `spanTags`, `generationTags`
- ID References: `sessionId`, `traceId`, `spanId`
7. Streaming Support with Metadata
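Streaming calls accept the same metadata; a sketch combining `streamText` with the `providerOptions.maxim` fields described above (the `./models` path is hypothetical):

```typescript
import { streamText } from "ai";
import { model } from "./models"; // hypothetical module exporting the wrapped model

const result = streamText({
  model,
  prompt: "Draft a short product announcement.",
  providerOptions: {
    maxim: {
      generationName: "announcement-draft",
      generationTags: { channel: "email" },
    },
  },
});

// Streaming chunks and the final completion are both recorded in the trace.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```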
8. Multiple Provider Support
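A sketch wrapping one model from each provider with the same helper (model IDs are illustrative; the `./logger` path is hypothetical):

```typescript
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { google } from "@ai-sdk/google";
import { wrapMaximAISDKModel } from "@maximai/maxim-js/vercel-ai-sdk";
import { logger } from "./logger"; // hypothetical module exporting the Maxim logger

// Each wrapped model exposes the identical AI SDK interface,
// so downstream code can switch providers without changes.
export const models = {
  openai: wrapMaximAISDKModel(openai("gpt-4o-mini"), logger),
  anthropic: wrapMaximAISDKModel(anthropic("claude-3-5-sonnet-latest"), logger),
  google: wrapMaximAISDKModel(google("gemini-1.5-flash"), logger),
};
```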
You can wrap and use models from OpenAI, Anthropic, and Google through the same interface.
9. Next.js API Route Example
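A sketch of a Next.js App Router route handler streaming a traced chat completion (AI SDK v4 `toDataStreamResponse`; the `@/lib/models` path is hypothetical):

```typescript
// app/api/chat/route.ts
import { streamText } from "ai";
import { model } from "@/lib/models"; // hypothetical module exporting the wrapped model

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model,
    messages,
    providerOptions: {
      maxim: { traceName: "chat-api" }, // optional Maxim metadata, as in step 6
    },
  });

  // Stream the response to the client in the AI SDK data-stream format.
  return result.toDataStreamResponse();
}
```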
10. Client-side Integration Example
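On the client, nothing Maxim-specific is required, since tracing happens server-side in the route handler; a sketch using the AI SDK's `useChat` hook pointed at the route above (hook import path varies by AI SDK version; `@ai-sdk/react` is assumed here):

```typescript
"use client";
import { useChat } from "@ai-sdk/react";

export default function Chat() {
  // useChat posts messages to the API route and manages streaming state.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat",
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Ask something…" />
    </form>
  );
}
```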
11. Visualize in Maxim
All requests, responses, and streaming events are automatically traced and can be viewed in your Maxim dashboard.
For more details, see the Maxim JS SDK documentation and the Vercel AI SDK documentation.