Mastra
Test your Mastra agents with deterministic LLM responses and AG-UI event streams. aimock integrates natively with CopilotKit's Mastra support.
Quick Start
Mastra supports multiple LLM providers through its model configuration. For
OpenAI-compatible models, point the base URL at aimock. The simplest approach is
setting the OPENAI_BASE_URL environment variable.
import { Agent } from "@mastra/core/agent";

const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  model: {
    provider: "OPEN_AI",
    name: "gpt-4o",
    toolChoice: "auto",
  },
});

// Set OPENAI_BASE_URL=http://localhost:4010/v1 to redirect to aimock
// All completions will be served from aimock fixtures
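To see why the environment variable is enough, consider how an OpenAI-compatible client typically resolves its endpoint. The sketch below is illustrative, not aimock's or any SDK's actual code; `resolveBaseUrl` is a hypothetical helper:

```typescript
// Hypothetical sketch: OpenAI-compatible clients generally prefer an
// OPENAI_BASE_URL override and only fall back to the hosted API.
// This is what lets aimock intercept every completion request.
function resolveBaseUrl(env: Record<string, string | undefined>): string {
  // No override set: talk to the real OpenAI API.
  return env.OPENAI_BASE_URL ?? "https://api.openai.com/v1";
}

// With the variable set, all traffic goes to the local aimock server.
const target = resolveBaseUrl({ OPENAI_BASE_URL: "http://localhost:4010/v1" });
```

Because the override is read from the environment, no Mastra configuration needs to change between tests and production.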
With Vitest Plugin
The useAimock plugin starts and stops the aimock server automatically around
your test suite. No manual setup or teardown required.
import { describe, it, expect } from "vitest";
import { useAimock } from "@copilotkit/aimock/vitest";
import { Agent } from "@mastra/core/agent";

const mock = useAimock({ fixtures: "./fixtures" });

describe("Mastra agent", () => {
  it("handles a travel planning request", async () => {
    process.env.OPENAI_BASE_URL = mock().url + "/v1";

    const agent = new Agent({
      name: "travel-planner",
      instructions: "You are a travel planning assistant",
      model: { provider: "OPEN_AI", name: "gpt-4o" },
    });

    const result = await agent.generate("plan a trip to Tokyo");
    expect(result.text).toContain("Tokyo");
  });
});
Tool Call Workflows
Mastra agents rely on tools for API calls and multi-step workflows, such as the travel-planning flow below. Define tool call fixtures to test these interactions deterministically.
{
  "fixtures": [
    {
      "match": { "userMessage": "plan a trip", "sequenceIndex": 0 },
      "response": {
        "toolCalls": [
          {
            "name": "search_flights",
            "arguments": "{\"origin\":\"SFO\",\"destination\":\"NRT\",\"date\":\"2025-03-15\"}"
          }
        ]
      }
    },
    {
      "match": { "userMessage": "plan a trip", "sequenceIndex": 1 },
      "response": {
        "toolCalls": [
          {
            "name": "search_hotels",
            "arguments": "{\"city\":\"Tokyo\",\"checkIn\":\"2025-03-15\",\"checkOut\":\"2025-03-22\"}"
          }
        ]
      }
    },
    {
      "match": { "userMessage": "plan a trip", "sequenceIndex": 2 },
      "response": {
        "content": "I found a great itinerary for your Tokyo trip!\n\n**Flight:** SFO → NRT on March 15, departing 11:30 AM (United UA837) — $890 round trip\n\n**Hotel:** Hotel Gracery Shinjuku, March 15–22 — $185/night\n\nWould you like me to book these, or would you prefer different options?"
      }
    }
  ]
}
Each fixture fires once in sequence using sequenceIndex. The first call
triggers a flight search, the second searches for hotels, and the third returns the
combined itinerary. See Fixtures for the full matching syntax.
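The sequencing behavior can be pictured with a small sketch. This is an illustrative model of sequenceIndex matching, not aimock's actual implementation: each call with the same userMessage consumes the next fixture in order.

```typescript
// Illustrative model of sequenceIndex matching (not aimock internals).
interface Fixture {
  match: { userMessage: string; sequenceIndex: number };
  response: unknown;
}

// Returns the fixture whose sequenceIndex equals the number of prior
// calls seen for this userMessage, then advances the per-message counter.
function pickFixture(
  fixtures: Fixture[],
  userMessage: string,
  counts: Map<string, number>,
): Fixture | undefined {
  const index = counts.get(userMessage) ?? 0;
  counts.set(userMessage, index + 1);
  return fixtures.find(
    (f) =>
      f.match.userMessage === userMessage &&
      f.match.sequenceIndex === index,
  );
}
```

Under this model, three "plan a trip" calls return the flight-search, hotel-search, and itinerary fixtures in turn, which is why the agent's tool loop replays deterministically.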
AG-UI Frontend Testing
When using Mastra with CopilotKit, the frontend receives AG-UI event streams. Use
AGUIMock to test the frontend separately from the agent — no running Mastra
server required.
import { LLMock, AGUIMock } from "@copilotkit/aimock";

const llm = new LLMock();
const agui = new AGUIMock();

agui.onMessage("hello", "Hi from the agent!");
agui.onToolCall(/search/, "web_search", '{"q":"test"}', { result: "[]" });

llm.mount("/agui", agui);
const url = await llm.start();

// Point your CopilotKit frontend at url + "/agui"
// It receives deterministic AG-UI SSE event streams
This lets you develop and test CopilotKit UI components against canned agent responses without running Mastra or any LLM provider. See AGUIMock for the full API.
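On the consuming side, the stream arrives as server-sent events. The sketch below shows a minimal way a frontend test might decode one SSE chunk into events; the event type name and payload shape are illustrative assumptions, not the exact AG-UI wire format:

```typescript
// Minimal SSE decoding sketch for frontend tests. Payload shape is an
// assumption for illustration, not the exact AG-UI wire format.
interface AguiEvent {
  type: string;
  [key: string]: unknown;
}

// SSE frames are newline-delimited; JSON payloads follow "data: ".
function parseSseChunk(chunk: string): AguiEvent[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => JSON.parse(line.slice("data: ".length)) as AguiEvent);
}
```

A test can then assert on the decoded event types rather than on raw bytes, keeping frontend assertions readable.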
CI with GitHub Action
Run your Mastra agent tests in CI with a single step. The aimock GitHub Action starts the server, waits for it to be healthy, and cleans up when the job finishes.
steps:
  - uses: actions/checkout@v4
  - uses: CopilotKit/aimock@v1
    with:
      fixtures: ./fixtures
  - run: pnpm test
    env:
      OPENAI_BASE_URL: http://127.0.0.1:4010/v1
See GitHub Action for all available inputs and outputs.
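The "waits for it to be healthy" step amounts to polling until the server answers. A generic readiness poll might look like the sketch below; `waitForHealthy` is a hypothetical helper, and any health-check URL you pass it is an assumption about your setup:

```typescript
// Generic readiness poll (hypothetical helper, not the action's code):
// retry a boolean check with a fixed delay until it passes or we give up.
async function waitForHealthy(
  check: () => Promise<boolean>,
  { retries = 20, delayMs = 250 }: { retries?: number; delayMs?: number } = {},
): Promise<boolean> {
  for (let i = 0; i < retries; i++) {
    if (await check()) return true;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return false;
}
```

In a local script you could pass a check such as an HTTP probe against the aimock port and fail fast when it never comes up, mirroring what the action does for you in CI.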