aimock — mock everything your AI app talks to

Deterministic mock infrastructure for AI apps

Start with LLM mocking. Add MCP, A2A, vector, and more when you need them. The aimock suite grows with your stack.

$ npm install @copilotkit/aimock
fixture.json
{
  "match": {
    "userMessage": "Hello"
  },
  "response": {
    "content": "Hi there! How can I help?"
  },
  "opts": {
    "chunkSize": 10,
    "latency": 1000
  }
}
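The fixture above pairs a matcher with a canned response and streaming options. A minimal sketch of how such a fixture could be applied, assuming exact matching on the user message and character-based chunking — the `Fixture` type and `applyFixture` function are illustrative names, not aimock's actual API:

```typescript
interface Fixture {
  match: { userMessage: string };
  response: { content: string };
  opts?: { chunkSize?: number; latency?: number };
}

// Return the canned response split into stream chunks, or null on no match.
function applyFixture(fixture: Fixture, userMessage: string): string[] | null {
  if (fixture.match.userMessage !== userMessage) return null;
  const { content } = fixture.response;
  const size = fixture.opts?.chunkSize ?? content.length;
  const chunks: string[] = [];
  for (let i = 0; i < content.length; i += size) {
    chunks.push(content.slice(i, i + size));
  }
  return chunks;
}
```

With `chunkSize: 10`, the example fixture would stream back in three chunks, which is what makes streaming behavior reproducible in tests.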
Record & Replay

From zero to fixtures in one command

1

Record

Proxy unmatched requests to real APIs and capture every response.

2

Save

Fixtures written to disk automatically as clean, editable JSON.

3

Replay

Deterministic responses in CI, forever. No API keys, no flakiness.

Learn about Record & Replay →
terminal
$ npx aimock --record --provider openai https://api.openai.com

 Listening on http://localhost:4010

 NO FIXTURE MATCH — proxying to
  https://api.openai.com/v1/chat/completions

 Recorded → fixtures/recorded/openai-2026-03-31T22:15:00.json

 Fixture match — replaying from disk
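The record-or-replay decision above is essentially a cache-aside lookup: serve a stored fixture on a match, otherwise proxy to the real API and persist the result. A sketch of that model — `FixtureStore` and `recordOrReplay` are made-up names, not aimock internals:

```typescript
type Fetcher = (key: string) => Promise<string>;

class FixtureStore {
  private fixtures = new Map<string, string>();

  // Replay a stored response, or proxy to the real API and record it.
  async recordOrReplay(
    key: string,
    fetchReal: Fetcher
  ): Promise<{ source: "fixture" | "recorded"; body: string }> {
    const hit = this.fixtures.get(key);
    if (hit !== undefined) return { source: "fixture", body: hit };
    const body = await fetchReal(key); // NO FIXTURE MATCH — proxy upstream
    this.fixtures.set(key, body);      // record for next run
    return { source: "recorded", body };
  }
}
```

The first request for a given key costs one real API call; every request after that is deterministic, which is why replay needs no API keys in CI.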

Your AI app talks to more than just LLMs.
aimock mocks all of them.

One JSON config. One port. Every service your AI app depends on.

aimock.json
{
  "llm": {
    "fixtures": "./fixtures/llm",
    "providers": ["openai", "claude", "gemini"]
  },
  "mcp": {
    "tools": "./fixtures/mcp/tools.json",
    "resources": "./fixtures/mcp/resources.json"
  },
  "a2a": {
    "agents": "./fixtures/a2a/agents.json"
  },
  "vector": {
    "provider": "pinecone",
    "fixtures": "./fixtures/vector"
  }
}
terminal
$ npx aimock --config aimock.json

 aimock v1.0.0

 LLM    mounted at  /v1/chat/completions
 LLM    mounted at  /v1/messages
 LLM    mounted at  /v1/embeddings
 MCP    mounted at  /mcp/tools/*
 A2A    mounted at  /a2a/agents/*
 Vector mounted at  /vectors/*

 Listening on http://localhost:4010
   6 services · 24 fixtures loaded
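The aimock.json above maps each service to its fixture location, and the boot log reflects which sections are present. Assuming the file is plain JSON with the shape shown, it can be typed and sanity-checked before mounting anything — the `AimockConfig` interface here is our own guess at the schema, not a published type:

```typescript
interface AimockConfig {
  llm?: { fixtures: string; providers: string[] };
  mcp?: { tools: string; resources: string };
  a2a?: { agents: string };
  vector?: { provider: string; fixtures: string };
}

// Parse the config and count configured services; reject an empty config.
function validateConfig(raw: string): { config: AimockConfig; services: number } {
  const config = JSON.parse(raw) as AimockConfig;
  const services = (["llm", "mcp", "a2a", "vector"] as const)
    .filter((key) => config[key] !== undefined).length;
  if (services === 0) throw new Error("aimock.json configures no services");
  return { config, services };
}
```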

Everything you need

📡

Every Major LLM Provider

OpenAI, Claude, Gemini, Bedrock, Azure, Vertex AI, Ollama, Cohere — full streaming and embeddings support for every provider.

🔌

MCP Protocol

Mock tools, resources, and prompts with full session management. Test your MCP integrations without running real tool servers.
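At its simplest, a mocked MCP tool call is a lookup from tool name to a fixture result. This sketch assumes tools.json maps tool names to canned outputs; the real MCP wire format frames these calls as JSON-RPC messages, which is omitted here:

```typescript
type ToolFixtures = Record<string, { result: unknown }>;

// Resolve a tool call against fixtures instead of a live tool server.
function callMockTool(fixtures: ToolFixtures, name: string): unknown {
  const entry = fixtures[name];
  if (!entry) throw new Error(`no fixture for tool "${name}"`);
  return entry.result;
}
```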

🤝

A2A Protocol

Agent cards, message routing, and SSE streaming. Mock multi-agent interactions with deterministic responses.

📦

Vector Databases

Pinecone, Qdrant, and ChromaDB compatible. Mock similarity search, upserts, and index operations with fixtures.
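Mocked similarity search can still behave like the real thing if fixtures store vectors and the mock ranks them by cosine similarity. A self-contained sketch — the fixture shape is assumed, not Pinecone's actual API:

```typescript
interface VectorFixture { id: string; values: number[] }

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank fixture vectors against a query vector, highest similarity first.
function query(fixtures: VectorFixture[], vector: number[], topK: number): string[] {
  return fixtures
    .map((f) => ({ id: f.id, score: cosine(f.values, vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, topK)
    .map((m) => m.id);
}
```

Because ranking is computed from the fixtures rather than hardcoded, the same fixtures exercise different queries deterministically.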

💥

Chaos Testing

Drop requests, return malformed responses, or disconnect mid-stream, each at a probability you set. Verify your app gracefully handles every failure mode.
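Probability-driven fault injection like this is simple to model: roll once per request and pick a failure mode, with an injectable random source so the chaos itself stays testable. Names here are illustrative, not aimock's configuration keys:

```typescript
type Fault = "drop" | "malformed" | "disconnect" | "ok";

interface ChaosOpts { drop: number; malformed: number; disconnect: number }

// Pick a fault for this request; `rand` is injectable for deterministic tests.
function rollFault(opts: ChaosOpts, rand: () => number = Math.random): Fault {
  const r = rand();
  if (r < opts.drop) return "drop";
  if (r < opts.drop + opts.malformed) return "malformed";
  if (r < opts.drop + opts.malformed + opts.disconnect) return "disconnect";
  return "ok";
}
```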

📊

Drift Detection

Daily CI validation against real APIs. Know immediately when provider behavior changes break your fixtures.

Verified against real APIs, every day

aimock's drift detection runs daily against live provider APIs. When response formats change, you know immediately — not when your tests break in production.

1

Real API calls

Daily CI hits actual OpenAI, Anthropic, Gemini endpoints to capture current response formats.

2

Response validation

Compares real responses against aimock's fixture format. Schema changes are caught instantly.

3

Auto-remediation

Drift detected → PR opened → fixtures, skills, and docs updated automatically. Zero manual effort.
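Step 2's validation can be approximated by a structural diff: compare the key set of a live response against the fixture's and flag anything missing or new. A simplified sketch — real schema validation would also check types and nested structure:

```typescript
// Report top-level keys that disappeared from or appeared in the live response.
function detectDrift(fixture: object, live: object): { missing: string[]; added: string[] } {
  const fixtureKeys = new Set(Object.keys(fixture));
  const liveKeys = new Set(Object.keys(live));
  return {
    missing: [...fixtureKeys].filter((k) => !liveKeys.has(k)),
    added: [...liveKeys].filter((k) => !fixtureKeys.has(k)),
  };
}
```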

How aimock compares

| Capability | aimock | MSW | VidaiMock | mock-llm | piyook/llm-mock |
| --- | --- | --- | --- | --- | --- |
| Cross-process interception | Real server ✓ | In-process only | — | — | Docker |
| Chat Completions SSE | Built-in ✓ | manual | — | — | — |
| Responses API SSE | Built-in ✓ | manual | — | — | — |
| Claude Messages API | Built-in ✓ | manual | — | — | — |
| Gemini streaming | Built-in ✓ | manual | — | — | — |
| WebSocket APIs | Built-in ✓ | — | — | — | — |
| Multi-provider support | 11 providers ✓ | manual | 11 providers | OpenAI only | OpenAI only |
| Embeddings API | Built-in ✓ | — | — | — | — |
| Structured output / JSON mode | Built-in ✓ | manual | — | — | — |
| Sequential / stateful responses | Built-in ✓ | manual | — | — | — |
| Fixture files | JSON ✓ | Code-only | Tera templates | YAML config | JSON templates |
| Programmatic API | TypeScript/JS | TypeScript/JS | No (binary only) | — | — |
| Request journal | Built-in ✓ | manual | — | — | — |
| Error injection | Built-in ✓ | partial | — | — | — |
| Docker + Helm | Both ✓ | — | Docker only | Both | Docker only |
| Drift detection | Built-in ✓ | — | — | — | — |
| Chaos testing | Built-in ✓ | — | — | — | — |
| Record & replay | Built-in ✓ | — | — | — | — |
| Prometheus metrics | Built-in ✓ | — | — | — | — |
| Streaming physics | Built-in ✓ | — | — | — | — |
| MCP tool mocking | Built-in ✓ | — | — | — | — |
| A2A agent mocking | Built-in ✓ | — | — | — | — |
| Vector DB mocking | Built-in ✓ | — | — | — | — |
| Search & rerank | Built-in ✓ | — | — | — | — |
| Dependencies | Zero ✓ | ~300KB | Zero (Rust) | Node+Express | Minimal |

Ready to switch? We got you.

Step-by-step migration guides for every major mock tool.

Built for production

AG-UI uses aimock for its end-to-end test suite, verifying AI agent behavior across LLM providers with fixture-driven responses.