# Switching from Mokksy to aimock
Mokksy (AI-Mocks) is a solid Kotlin/Ktor-based mock server for JVM teams. aimock gives you the same LLM mocking from any language—plus MCP, A2A, vector databases, record-and-replay, and drift detection that Mokksy doesn't have.
## The quick switch
JVM tests can run aimock via Docker, or via `npx aimock` if Node.js is available.
Point the OpenAI Java SDK at aimock's base URL and your existing test assertions stay the same.
```kotlin
// JUnit 5 — start aimock before tests.
// Requires @TestInstance(TestInstance.Lifecycle.PER_CLASS) on the test class
// (or put these methods in a companion object with @JvmStatic).
@BeforeAll
fun setup() {
    // Docker bind mounts need an absolute host path
    val fixtures = File("fixtures").absolutePath
    ProcessBuilder(
        "docker", "run", "-d", "--name", "aimock",
        "-p", "4010:4010", "-v", "$fixtures:/fixtures",
        "ghcr.io/copilotkit/aimock", "-f", "/fixtures"
    ).start().waitFor()
}

@AfterAll
fun teardown() {
    ProcessBuilder("docker", "stop", "aimock").start().waitFor()
    ProcessBuilder("docker", "rm", "aimock").start().waitFor()
}

// Point the OpenAI Java SDK at aimock.
// The builder lives on OpenAIOkHttpClient; it returns an OpenAIClient.
val client: OpenAIClient = OpenAIOkHttpClient.builder()
    .baseUrl("http://localhost:4010/v1")
    .apiKey("mock")
    .build()
```
## What you gain
### Cross-language testing
Kotlin, Python, TypeScript, Go, Rust—any language that speaks HTTP can use the same mock server with the same fixtures.
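Because the mock is just an HTTP server, no SDK is strictly required. A minimal sketch using only the JDK's built-in HTTP client, assuming aimock serves the standard OpenAI-compatible route at `/v1/chat/completions` under the same base URL the quick-switch example uses:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // A plain OpenAI-shaped chat completion request against the mock
    val body = """{"model":"gpt-4o-mini","messages":[{"role":"user","content":"hi"}]}"""
    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:4010/v1/chat/completions"))
        .header("Content-Type", "application/json")
        .header("Authorization", "Bearer mock")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body()) // the mocked completion JSON
}
```

Any other language's HTTP client follows the same pattern against the same fixtures.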
### MCP / A2A / Vector mocking
Mock your entire AI stack: MCP tool servers, A2A agent protocols, and vector databases—not just LLM completions.
### Record & replay
Proxy real APIs, save responses as JSON fixtures, replay deterministically forever. No manual response construction.
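To give a feel for the shape of the data, a recorded fixture might look something like the sketch below. The exact schema is aimock's own; this fragment is an assumption based on the OpenAI response format, not the documented fixture layout:

```json
{
  "request": { "method": "POST", "path": "/v1/chat/completions" },
  "response": {
    "status": 200,
    "body": {
      "choices": [
        { "message": { "role": "assistant", "content": "Hello!" } }
      ]
    }
  }
}
```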
### More providers (11 vs 5)
OpenAI, Claude, Gemini, Bedrock, Azure, Vertex AI, Ollama, Cohere, and more. Mokksy covers OpenAI, Anthropic, Google, Ollama, and MistralAI.
### Drift detection
Automatically detect when real provider APIs diverge from your mocked responses. Know when your tests are lying to you.
### Chaos testing
Inject latency, errors, partial failures, and rate limits. Verify your app handles degraded AI services gracefully.
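The fault injection itself is configured on the aimock side (not shown here); what a chaos test exercises is your client's resilience. A minimal sketch of the kind of retry-with-backoff logic such a test should verify — generic Kotlin, nothing aimock-specific assumed:

```kotlin
// Retry a call with exponential backoff on transient failures
// (the 429s, 5xx errors, and timeouts a chaos-enabled mock injects).
fun <T> withRetries(
    maxAttempts: Int = 3,
    baseDelayMs: Long = 200,
    isTransient: (Exception) -> Boolean,
    call: () -> T,
): T {
    var lastError: Exception? = null
    for (attempt in 0 until maxAttempts) {
        try {
            return call()
        } catch (e: Exception) {
            if (!isTransient(e)) throw e // don't retry permanent failures
            lastError = e
            // Exponential backoff: 200 ms, 400 ms, 800 ms, ...
            Thread.sleep(baseDelayMs shl attempt)
        }
    }
    throw lastError!!
}
```

Running the same test suite with chaos enabled on the mock confirms this path actually fires, rather than only the happy path.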
### Docker + Helm native
First-class Docker images and Helm charts. Drop aimock into any CI pipeline or Kubernetes cluster without JVM dependencies.
## What you lose (honestly)

- **Native Kotlin DSL.** Mokksy's `mockLLM { }` builder is elegant; with aimock you configure via HTTP/JSON fixtures or the Docker CLI.
- **In-process JVM mock.** Mokksy embeds inside your test process; aimock runs as a separate server (localhost or container).
- **Kotlin coroutine integration.** Mokksy's streaming is backed natively by Ktor and coroutines; aimock streams over plain HTTP SSE.
- **Ktor-specific test helpers.** If your app uses Ktor's test engine, Mokksy plugs in directly; aimock requires a network call.
## Comparison table
| Capability | Mokksy | aimock |
|---|---|---|
| Language | Kotlin/JVM only | Any (HTTP-based) |
| In-process mock | ✓ | ✗ |
| Cross-language / cross-process | ✗ | ✓ |
| LLM providers | 5 (OpenAI, Anthropic, Google, Ollama, MistralAI) | 11+ |
| Streaming SSE | ✓ | ✓ |
| WebSocket APIs | ✗ | ✓ |
| MCP mock | ✗ | ✓ |
| A2A mock | ✗ | ✓ |
| Vector DB mock | ✗ | ✓ |
| Record & replay | ✗ | ✓ |
| Drift detection | ✗ | ✓ |
| Chaos testing | ✗ | ✓ |
| Docker / Helm | ✗ | ✓ |
| Kotlin DSL | ✓ | ✗ |
| Coroutine integration | ✓ | ✗ |
## Testcontainers integration
For a cleaner lifecycle, use Testcontainers to manage the aimock container automatically.
```kotlin
// Using Testcontainers for a cleaner container lifecycle
class AIMockContainer : GenericContainer<AIMockContainer>("ghcr.io/copilotkit/aimock") {
    init {
        withExposedPorts(4010)
        // Bind mounts want an absolute host path
        withFileSystemBind(File("fixtures").absolutePath, "/fixtures")
        withCommand("-f", "/fixtures")
        waitingFor(Wait.forHttp("/health").forPort(4010))
    }
}

@Testcontainers
class MyAgentTest {
    companion object {
        // @JvmStatic makes this a static field, so the container is
        // shared across all tests in the class
        @Container
        @JvmStatic
        val aimock = AIMockContainer()
    }

    @Test
    fun `agent responds with mocked content`() {
        val baseUrl = "http://${aimock.host}:${aimock.getMappedPort(4010)}/v1"
        val client: OpenAIClient = OpenAIOkHttpClient.builder()
            .baseUrl(baseUrl)
            .apiKey("mock")
            .build()
        // ... your test assertions
    }
}
```
## CLI / Docker quick start
```shell
# Run via npx (requires Node.js)
npx aimock -p 4010 -f ./fixtures

# Or run the Docker image
docker run -d -p 4010:4010 \
  -v $(pwd)/fixtures:/fixtures \
  ghcr.io/copilotkit/aimock:latest \
  -f /fixtures
```