> bernard

A local CLI AI agent with multi-provider support, persistent memory, and autonomous tool use.

v0.4.0 · MIT · Node ≥ 20

Everything you need in a CLI agent

Bernard runs locally, talks to any major LLM provider, and comes with tools out of the box.

Switch between Anthropic, OpenAI, and xAI with a flag. One interface, any model.

Up and running in two minutes

Install globally from npm, set an API key, and start the REPL.

# Install globally
$ npm install -g bernard-agent

# Store your API key securely
$ bernard add-key anthropic sk-ant-...

# Start the REPL
$ bernard

# Ask anything
bernard> what git branch am I on?
  ▶ shell: git branch --show-current
  You're on the main branch.
1. Install

One command with npm. Requires Node.js 20 or later.

2. Configure

Set an API key for your preferred provider via environment variable or .env file.
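A minimal sketch of both options. The variable names below assume the conventional <PROVIDER>_API_KEY naming and are not confirmed by this page (the quickstart's bernard add-key is the third option):

```shell
# Option A: export in your shell
# (variable name assumes the common <PROVIDER>_API_KEY convention;
# verify against Bernard's docs)
export ANTHROPIC_API_KEY="sk-ant-..."

# Option B: a .env file in the working directory
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
XAI_API_KEY=xai-...
EOF
```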

3. Run

Launch the interactive REPL and start asking questions, running commands, and building.

What you can build with MCP

Connect MCP servers to give Bernard access to external services. Combine tools, cron jobs, and memory to build autonomous workflows.


Schedule a meeting end-to-end

Email a contact, check your calendar for availability, coordinate a time, create the event, and send the invite—all autonomously.

requires: google-gmail MCP + google-calendar MCP

Deploy monitoring with alerts

Watch a deploy pipeline, notify you on failure via Slack, and roll back if needed.

requires: slack MCP + github MCP
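Wiring up servers like these usually means giving the agent a registry of MCP server definitions. A hypothetical sketch, assuming Bernard reads a JSON file at ~/.bernard/mcp.json; the path and schema are assumptions, not confirmed by this page, though the "mcpServers" shape follows the convention other MCP clients use:

```shell
# Hypothetical config location and schema -- check Bernard's docs
# for the real ones before copying this.
mkdir -p ~/.bernard
cat > ~/.bernard/mcp.json <<'EOF'
{
  "mcpServers": {
    "slack": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-slack"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
EOF
```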

Gets smarter the more you use it

Bernard automatically builds a knowledge base from your conversations using local embeddings and RAG. No cloud storage, no data leaving your machine.

1. You work normally

Ask questions, run commands, solve problems. Bernard observes what matters—your preferences, project conventions, key decisions.

2. Facts are extracted by domain

When conversations get long, Bernard compresses history and extracts facts into three domains—tool usage patterns, user preferences, and general knowledge—each with a specialized prompt.

3. Context is recalled automatically

Each new message retrieves relevant facts per domain. Results are organized by category so Bernard knows whether it's recalling a command pattern, your preference, or a project detail.

  • Storage: local only (~/.bernard/rag/)
  • Embeddings: run locally via fastembed, no API calls
  • Domains: three specialized extractors (tool-usage, user-preferences, general)
  • Deduplication: automatic; near-duplicate facts are merged
  • Pruning: old, unused memories decay over time to keep context sharp

Your models, your choice

Bernard uses the Vercel AI SDK for unified access across providers. Switch with a single flag.
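For example (hypothetical invocation: the page only promises "a single flag", so the flag name below is a guess; check bernard --help for the real one):

```shell
# --provider is an assumed flag name, not confirmed by this page
$ bernard --provider anthropic   # Claude models
$ bernard --provider openai      # GPT models
$ bernard --provider xai         # Grok models
```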

Help make Bernard better

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Run npm run build to verify the build
  4. Run npm test to make sure the tests pass
  5. Submit a pull request

Bug Reports

  • Steps to reproduce the issue
  • Expected vs actual behavior
  • Environment info (OS, Node version, provider)
  • Relevant error messages or logs