A skill system for pydantic-ai agents, implementing the Agent Skills specification.
Load skills from `SKILL.md` directories, Python entrypoints, or MCP servers and attach them to your agent as a `SkillsCapability` or `SkillToolset`. Skills run as isolated sub-agents by default, keeping your main agent's tool space clean regardless of how many skills you load.
Install:

```shell
uv add haiku.skills
```

```python
from pydantic_ai import Agent

from haiku.skills import SkillsCapability

agent = Agent(
    "anthropic:claude-sonnet-4-5-20250929",
    capabilities=[SkillsCapability(use_entrypoints=True)],
)

result = await agent.run("Analyze this dataset.")
print(result.output)
```

Or with explicit control over the toolset:
```python
from pathlib import Path

from pydantic_ai import Agent

from haiku.skills import SkillToolset, build_system_prompt

toolset = SkillToolset(skill_paths=[Path("./skills")])
agent = Agent(
    "anthropic:claude-sonnet-4-5-20250929",
    instructions=build_system_prompt(toolset.skill_catalog),
    toolsets=[toolset],
)
```

Two execution modes:
- Sub-agent mode (default): exposes a single `execute_skill` tool. Each invocation spins up a focused sub-agent with only that skill's instructions, tools, and token budget. The main agent never sees skill internals.
- Direct mode (`use_subagents=False`): exposes skill tools directly to the main agent. No sub-agent LLM loops, lower latency, and the agent retains tool results in its conversation context.
Skills are discovered automatically and can come from three sources:
- Filesystem: directories containing a `SKILL.md` file, with scripts in Python, JavaScript, TypeScript, or shell
- Entrypoints: Python packages that register via the `haiku.skills` entry-points group, with typed in-process tools, per-skill Pydantic state, and zero-config discovery
- MCP servers: wrap any MCP server as a skill with `skill_from_mcp`
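For the filesystem source, a skill is a directory with a `SKILL.md` at its root. A minimal sketch, following the Agent Skills convention of YAML frontmatter with `name` and `description` fields (the skill name, description, and body here are illustrative; any fields beyond those two are assumptions, not confirmed by this README):

```markdown
---
name: dataset-analysis
description: Summarize CSV datasets and report basic statistics.
---

# Dataset analysis

Run `scripts/analyze.py <file.csv>` and report the column summary it prints.
```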
- Per-skill state: Pydantic state models tracked per namespace; state deltas emitted as JSON Patch for the AG-UI protocol
- AG-UI streaming: skill tool calls and state changes stream as `ActivitySnapshotEvent` and `StateDeltaEvent` for real-time UIs
- Signing and verification: identity-based skill signing via sigstore, verified at discovery time
- CLI: `haiku-skills list`, `validate`, `sign`, `verify`, and an interactive `chat` TUI
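To illustrate the per-skill state feature: a "state delta as JSON Patch" is a list of RFC 6902 operations describing what changed between two state snapshots. The sketch below shows the idea with plain single-level dicts; `diff_to_patch` is a hypothetical helper for illustration, not part of the haiku.skills API, which tracks Pydantic models per namespace.

```python
# Illustrative only: expressing a state change as a JSON Patch (RFC 6902).
def diff_to_patch(old: dict, new: dict) -> list[dict]:
    """Build a flat JSON Patch from two single-level dicts."""
    patch = []
    for key in old.keys() - new.keys():
        patch.append({"op": "remove", "path": f"/{key}"})
    for key, value in new.items():
        if key not in old:
            patch.append({"op": "add", "path": f"/{key}", "value": value})
        elif old[key] != value:
            patch.append({"op": "replace", "path": f"/{key}", "value": value})
    return patch

before = {"query": "cats", "results": 0}
after = {"query": "cats", "results": 3}
print(diff_to_patch(before, after))
# [{'op': 'replace', 'path': '/results', 'value': 3}]
```

A consumer on the AG-UI side applies each operation in order to its copy of the state, so only the changed fields cross the wire.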
The repo ships several skill packages as references and for immediate use:
| Skill | Description |
|---|---|
| `web` | Search (Brave) and fetch web pages with readability extraction |
| `code-execution` | Sandboxed Python execution via Monty |
| `image-generation` | Image generation with `await llm()` support |
| `gmail` | Gmail integration |
| `notifications` | System notifications |
Each is a standalone package: `uv add haiku-skills-web`, `uv add haiku-skills-code-execution`, etc. They register via the `haiku.skills` entrypoints group and are discovered automatically.
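For reference, entry-point registration in a skill package's `pyproject.toml` might look like the following. Only the `haiku.skills` group name comes from this README; the module path and attribute name are illustrative assumptions:

```toml
# Hypothetical sketch: the entry-point name and target are assumptions.
[project.entry-points."haiku.skills"]
web = "haiku_skills_web:skill"
```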
Full documentation at ggozad.github.io/haiku.skills.
MIT