# aimock CLI

aimock is the full-stack mock orchestrator. Where llmock serves LLM endpoints only, aimock reads a JSON config file and serves LLM mocks alongside additional mock services (MCP, A2A, vector stores) on a single port.

## aimock vs llmock

| Capability | llmock CLI | aimock CLI |
| --- | --- | --- |
| LLM mock endpoints | Yes | Yes |
| Additional mock services | No | Yes (via mount) |
| Config file | CLI flags only | JSON config file |
| Single-port routing | LLM paths only | All services on one port |

## Quick Start

**Run aimock**

```shell
$ npx aimock --config aimock.json --port 4010
```

**Run with Docker**

```shell
$ docker run -d -p 4010:4010 \
  -v ./aimock.json:/config.json \
  -v ./fixtures:/fixtures \
  ghcr.io/copilotkit/aimock \
  npx aimock --config /config.json --port 4010
```

## Config File Format

The config file is a JSON object describing which services to run and how to configure them. The `llm` section configures the core LLMock server. Additional services are mounted at path prefixes.

**aimock.json**

```json
{
  "llm": {
    "fixtures": "./fixtures",
    "latency": 0,
    "chunkSize": 20,
    "logLevel": "info",
    "validateOnLoad": true,
    "metrics": true,
    "strict": false
  },
  "services": {
    "/mcp": {
      "type": "mcp",
      "tools": "./mcp-tools.json"
    },
    "/a2a": {
      "type": "a2a",
      "agents": "./a2a-agents.json"
    }
  }
}
```

## Config Fields

| Field | Type | Description |
| --- | --- | --- |
| `llm` | object | LLMock configuration. Accepts `fixtures`, `latency`, `chunkSize`, `logLevel`, `validateOnLoad`, `metrics`, `strict`, `chaos`, `streamingProfile`. |
| `services` | object | Map of mount paths to service configs. Each key is a URL path prefix (e.g. `/mcp`), each value describes the service type and its options. |
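
As a quick sanity check, a config matching the fields above can be linted with a short standard-library script. This is an illustrative sketch, not aimock's own `validateOnLoad` logic; the `check_config` helper and its rules are hypothetical:

```python
import json

# Top-level keys the aimock config format uses (per the table above).
ALLOWED_TOP = {"llm", "services"}

def check_config(text: str) -> list[str]:
    """Return a list of problems found in an aimock.json document."""
    cfg = json.loads(text)
    problems = []
    for key in cfg:
        if key not in ALLOWED_TOP:
            problems.append(f"unknown top-level key: {key}")
    for path, svc in cfg.get("services", {}).items():
        if not path.startswith("/"):
            problems.append(f"mount path must start with '/': {path}")
        if "type" not in svc:
            problems.append(f"service at {path} is missing 'type'")
    return problems

sample = '{"llm": {"fixtures": "./fixtures"}, "services": {"/mcp": {"type": "mcp"}}}'
# check_config(sample) returns an empty list for a well-formed config
```

Wiring a check like this into CI catches malformed configs before the mock server ever starts.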

## CLI Flags

| Option | Default | Description |
| --- | --- | --- |
| `--config` | `aimock.json` | Path to JSON config file |
| `--port` | `4010` | Port to listen on (overrides config) |
| `--host` | `127.0.0.1` | Host to bind to (overrides config) |
| `--help` | | Show help |

## Single-Port Routing

All services share one port. Requests are routed by path prefix. LLM endpoints live at the root, mounted services at their configured prefix:

| Path | Service |
| --- | --- |
| `/v1/chat/completions` | LLMock (OpenAI Chat Completions) |
| `/v1/messages` | LLMock (Anthropic Claude) |
| `/v1/embeddings` | LLMock (Embeddings) |
| `/mcp/*` | MCP mock service |
| `/a2a/*` | A2A mock service |
| `/health` | Unified health check (all services) |
| `/metrics` | Prometheus metrics (if enabled) |

Path stripping is automatic: a request to `/mcp/tools/list` arrives at the MCP service as `/tools/list`.
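
The prefix matching and path stripping described above can be sketched as a small pure function. This is a minimal illustration of the routing behavior, not aimock's internals; the `MOUNTS` table and service names are hypothetical stand-ins for the `services` section:

```python
# Mount table mirroring the "services" section of aimock.json
# (illustrative names, not aimock internals).
MOUNTS = {"/mcp": "mcp-service", "/a2a": "a2a-service"}

def route(path: str) -> tuple[str, str]:
    """Return (service, stripped_path) for an incoming request path."""
    for prefix, service in MOUNTS.items():
        if path == prefix or path.startswith(prefix + "/"):
            # Strip the mount prefix before handing off to the service.
            stripped = path[len(prefix):] or "/"
            return service, stripped
    # Anything not under a mount prefix goes to the LLM mock at the root.
    return "llm", path

# route("/mcp/tools/list") yields ("mcp-service", "/tools/list"),
# while "/v1/chat/completions" falls through to the root LLM service.
```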

## Docker Usage

**Run with config**

```shell
$ npx aimock --config aimock.json --host 0.0.0.0
```

**Docker run with config**

```shell
# Mount config and fixtures into the container
$ docker run -p 4010:4010 \
  -v ./aimock.json:/config.json \
  -v ./fixtures:/fixtures \
  ghcr.io/copilotkit/aimock \
  npx aimock --config /config.json --host 0.0.0.0
```

## Docker Compose

**docker-compose.yml**

```yaml
services:
  aimock:
    image: ghcr.io/copilotkit/aimock:latest
    command: aimock --config /app/aimock.json --host 0.0.0.0
    ports:
      - "4010:4010"
    volumes:
      - ./aimock.json:/app/aimock.json:ro
      - ./fixtures:/app/fixtures:ro

  app:
    build: .
    environment:
      OPENAI_BASE_URL: http://aimock:4010/v1
      MCP_SERVER_URL: http://aimock:4010/mcp
    depends_on:
      - aimock
```
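
Inside the `app` container, clients only need to build their endpoint URLs from the environment variables the compose file sets. A small sketch, with illustrative localhost fallbacks for running outside Compose (the `endpoint` helper is hypothetical, not part of any SDK):

```python
import os

def endpoint(env: dict, var: str, default: str, path: str) -> str:
    """Build a full endpoint URL from an env var with a local fallback."""
    return env.get(var, default).rstrip("/") + path

# As wired in docker-compose.yml; the localhost defaults are
# illustrative fallbacks for local (non-Compose) runs.
chat = endpoint(os.environ, "OPENAI_BASE_URL",
                "http://localhost:4010/v1", "/chat/completions")
mcp = endpoint(os.environ, "MCP_SERVER_URL",
               "http://localhost:4010/mcp", "/tools/list")
```

Because routing is single-port, both URLs resolve to the same aimock container; only the path prefix differs.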