A local CLI AI agent with multi-provider support, persistent memory, and autonomous tool use.
Bernard runs locally, talks to any major LLM provider, and comes with tools out of the box.
Install globally from npm, set an API key, and start the REPL.

1. Install with one npm command. Requires Node.js 20 or later.
2. Set an API key for your preferred provider via environment variable or .env file.
3. Launch the interactive REPL and start asking questions, running commands, and building.
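The steps above look roughly like this; the npm package name and the exact environment variable are assumptions here, so substitute the ones from your provider's documentation:

```shell
# Install globally (requires Node.js 20+); package name assumed to be `bernard`
npm install -g bernard

# Set a key for your preferred provider (or put this line in a .env file)
export OPENAI_API_KEY=sk-...

# Start the interactive REPL
bernard
```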
Connect MCP servers to give Bernard access to external services. Combine tools, cron jobs, and memory to build autonomous workflows.
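MCP servers are typically declared as commands the client spawns. A hypothetical configuration sketch, assuming a JSON config file (the file location, key names, and server package are illustrative, not Bernard's documented format):

```json
{
  "mcpServers": {
    "google-calendar": {
      "command": "npx",
      "args": ["-y", "@example/google-calendar-mcp"]
    },
    "slack": {
      "command": "npx",
      "args": ["-y", "@example/slack-mcp"]
    }
  }
}
```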
Email a contact, check your calendar for availability, coordinate a time, create the event, and send the invite, all autonomously. (requires: google-gmail MCP + google-calendar MCP)

Watch a deploy pipeline, notify you on failure via Slack, and roll back if needed. (requires: slack MCP + github MCP)

Bernard automatically builds a knowledge base from your conversations using local embeddings and RAG. No cloud storage, no data leaving your machine.
Ask questions, run commands, solve problems. Bernard observes what matters—your preferences, project conventions, key decisions.
When conversations get long, Bernard compresses history and extracts facts into three domains—tool usage patterns, user preferences, and general knowledge—each with a specialized prompt.
Each new message retrieves relevant facts per domain. Results are organized by category so Bernard knows whether it's recalling a command pattern, your preference, or a project detail.
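Per-domain retrieval of this kind can be sketched as a nearest-neighbor search run separately over each domain. The types and function names below are illustrative, not Bernard's actual internals; the sketch assumes facts are stored as embedded records tagged with one of the three domains:

```typescript
// Illustrative sketch of per-domain fact retrieval over local embeddings.
type Domain = "tool-usage" | "user-preferences" | "general";

interface Fact {
  text: string;
  domain: Domain;
  embedding: number[];
}

// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// For each domain, return the k facts most similar to the query embedding,
// keeping results grouped so the caller knows which kind of fact it recalled.
function retrieveByDomain(
  query: number[],
  facts: Fact[],
  k = 3
): Record<Domain, Fact[]> {
  const result: Record<Domain, Fact[]> = {
    "tool-usage": [],
    "user-preferences": [],
    general: [],
  };
  for (const domain of Object.keys(result) as Domain[]) {
    result[domain] = facts
      .filter((f) => f.domain === domain)
      .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
      .slice(0, k);
  }
  return result;
}
```

Keeping the domains separate at query time is what lets the prompt label each recalled fact as a command pattern, a preference, or general knowledge.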
Facts are stored locally under ~/.bernard/rag/, partitioned into the three domains: tool-usage, user-preferences, and general.
Bernard uses the Vercel AI SDK for unified access across providers. Switch with a single flag.
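A hypothetical invocation sketch; the flag and value names are assumptions, so check the CLI's own help output for the real ones:

```shell
# Switch providers at launch (illustrative flag names)
bernard --provider anthropic
bernard --provider openai
```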
Run npm run build to verify the build and npm test to run the test suite.