# Switching from VidaiMock to aimock
VidaiMock is a capable Rust binary with broad provider support. aimock matches that coverage and adds what VidaiMock lacks: a programmatic TypeScript API, WebSocket support, fixture files, a request journal, and MCP/A2A/Vector mocking.
## The quick switch
The CLI flags map directly. Swap the binary name, adjust the flag names, and go:
```bash
# VidaiMock
vidaimock --port 4010 --template-dir ./templates

# aimock
npx aimock -p 4010 -f ./fixtures
```
Or in Docker:

```bash
docker run -d -p 4010:4010 \
  -v $(pwd)/fixtures:/fixtures \
  ghcr.io/copilotkit/aimock \
  npx aimock -p 4010 -f /fixtures
```
## Fixture format comparison
VidaiMock uses Tera templates (.tera) to build responses dynamically. aimock
uses plain JSON fixture files — no template language, no compilation step.
A VidaiMock Tera template interpolates request values into the response body:

```jinja
{
  "model": "{{ model }}",
  "choices": [{
    "message": {
      "role": "assistant",
      "content": "Hello from VidaiMock"
    }
  }]
}
```
The equivalent aimock fixture:

```json
{
  "match": { "userMessage": "hello" },
  "response": { "content": "Hello from aimock" }
}
```
aimock fixtures are declarative: you specify what to match and what to return. The server handles model names, streaming format, and provider-specific envelope generation automatically.
## What you gain
### Programmatic API
Import LLMock in TypeScript/JavaScript and configure fixtures,
assertions, and lifecycle in code — no process spawning needed.
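As a minimal sketch of what an in-process setup can look like: the `LLMock` import and the fixture `match`/`response` shape come from this guide, but the constructor options and the `start()`/`stop()` method names are assumptions about the API surface, not confirmed signatures.

```typescript
// Sketch only: option and method names below are assumed, not documented.
import { LLMock } from "aimock";

const mock = new LLMock({
  port: 4010, // same port the CLI examples use
  fixtures: [
    { match: { userMessage: "hello" }, response: { content: "Hi!" } },
  ],
});

await mock.start(); // serve mocks in-process, no child process to spawn
// ... exercise the code under test against http://localhost:4010/v1 ...
await mock.stop();
```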
### WebSocket APIs
Built-in support for OpenAI Realtime, OpenAI Responses WS, and Gemini Live — three WebSocket protocols out of the box.
### Request journal
Every request is recorded with full headers, body, matched fixture, and timing.
getRequests() lets you assert on what your app actually sent.
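A hedged sketch of journal-based assertions: aside from `getRequests()`, which this guide names, every type and property below (the constructor options, `start()`/`stop()`, the `body` shape on a recorded request) is an assumption.

```typescript
// Sketch only: everything except getRequests() is an assumed API shape.
import { LLMock } from "aimock";

const mock = new LLMock({ port: 4010, fixtures: "./fixtures" });
await mock.start();

// ... app under test makes one chat completion call ...

// Assert on what the app actually sent, not just what it received.
const requests = mock.getRequests();
console.assert(requests.length === 1);
console.assert(requests[0].body.messages[0].content === "hello");

await mock.stop();
```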
### MCP / A2A / Vector
Mock MCP tool servers, A2A agent-to-agent endpoints, and vector database APIs alongside your LLM mocks on one port.
### Record & replay
Proxy to real APIs, capture responses as fixtures, then replay deterministically in CI. No manual fixture authoring required.
### Drift detection
Automated three-way conformance testing against live provider APIs. Catch breaking changes before your users do.
### Structured output
JSON mode and response_format matching for structured responses —
match on schema, return typed JSON.
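For illustration, a fixture along these lines could match JSON-mode requests and return typed JSON. The key names here (`responseFormat`, `json`) are assumptions about the fixture schema, not documented fields:

```json
{
  "match": { "responseFormat": "json_object" },
  "response": { "json": { "sentiment": "positive", "confidence": 0.92 } }
}
```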
### Sequential responses
Return different responses on successive calls to the same endpoint. Model multi-turn conversations and retry scenarios.
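A hedged sketch of what a retry scenario could look like as a fixture, failing once and then succeeding. The `responses` array form and the `error` shape are assumptions inferred from the feature description, not a documented schema:

```json
{
  "match": { "userMessage": "flaky" },
  "responses": [
    { "error": { "status": 500, "message": "rate limited" } },
    { "content": "Recovered on the second attempt" }
  ]
}
```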
## Comparison table
| Capability | VidaiMock | aimock |
|---|---|---|
| Programmatic API | No (binary only) | Yes (TypeScript/JS) |
| WebSocket APIs | No | Built-in (3 protocols) |
| Request journal | No | Built-in |
| MCP / A2A / Vector | No | Built-in |
| Record & replay | No | Built-in |
| Drift detection | No | Automated CI |
| LLM providers | 11+ | 10+ |
| Prometheus metrics | Yes | Yes |
| Chaos testing | Partial | Built-in (3 modes) |
| Docker | Yes | Yes |
| Streaming physics | Yes | Built-in (ttft/tps/jitter) |
| Dependencies | Zero (Rust) | Zero (Node.js builtins) |
## CLI / Docker quick start
```bash
# Run the mock server
npx aimock -p 4010 -f ./fixtures

# Point your app at it
export OPENAI_BASE_URL=http://localhost:4010/v1
export OPENAI_API_KEY=mock-key
```
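With the server running, a quick smoke test against the OpenAI-compatible endpoint looks like this. The path and payload follow the standard chat completions shape; the exact reply depends on your fixtures, so none is shown here:

```bash
curl -s http://localhost:4010/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer mock-key" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "hello"}]}'
```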
```bash
# Full config-driven setup (LLM + MCP + A2A on one port)
npx aimock --config aimock.json --port 4010
```
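As a shape sketch only: every key below is an assumption about what `aimock.json` might accept, inferred from the features this guide lists, not a documented schema.

```json
{
  "fixtures": "./fixtures",
  "mcp": { "enabled": true },
  "a2a": { "enabled": true }
}
```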
```bash
# Pull and run from GitHub Container Registry
docker pull ghcr.io/copilotkit/aimock:latest
docker run -p 4010:4010 \
  -v $(pwd)/fixtures:/fixtures \
  ghcr.io/copilotkit/aimock

# With a config file
docker run -p 4010:4010 \
  -v $(pwd)/aimock.json:/app/aimock.json \
  -v $(pwd)/fixtures:/app/fixtures \
  ghcr.io/copilotkit/aimock aimock --config /app/aimock.json --host 0.0.0.0
```
The same setup as a Docker Compose file:

```yaml
services:
  aimock:
    image: ghcr.io/copilotkit/aimock:latest
    command: aimock --config /app/aimock.json --host 0.0.0.0
    ports:
      - "4010:4010"
    volumes:
      - ./aimock.json:/app/aimock.json:ro
      - ./fixtures:/app/fixtures:ro
  app:
    build: .
    environment:
      OPENAI_BASE_URL: http://aimock:4010/v1
      OPENAI_API_KEY: mock-key
    depends_on:
      - aimock
```