Getting Started with LibreFang

This guide walks you through installing LibreFang, configuring your first LLM provider, spawning an agent, and chatting with it.

Project website: https://librefang.ai/

Table of Contents

  • Installation
  • Configuration
  • Spawn Your First Agent
  • Chat with an Agent
  • Start the Daemon
  • Using the WebChat UI
  • Next Steps
  • Useful Commands Reference

Installation

Option 1: Cargo Install (Any Platform)

LibreFang does not publish GitHub Releases yet. The current recommended install path is:

cargo install --git https://github.com/librefang/librefang librefang-cli

Or build from source:

git clone https://github.com/librefang/librefang.git
cd librefang
cargo install --path crates/librefang-cli

Option 2: Shell Installer (Linux / macOS, for future releases)

curl -fsSL https://librefang.ai/install.sh | sh

Use this once LibreFang starts publishing GitHub Releases. The script installs the CLI binary to ~/.librefang/bin/.
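
If ~/.librefang/bin/ is not already on your PATH, you will need to add it yourself. A minimal sketch for bash (adjust the profile file for your shell):

```shell
# Make the installed binary visible to the current shell session.
# Add this same line to your shell profile (~/.bashrc, ~/.zshrc, etc.)
# to persist it across sessions.
export PATH="$HOME/.librefang/bin:$PATH"
```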

Option 3: PowerShell Installer (Windows, for future releases)

irm https://librefang.ai/install.ps1 | iex

Use this once LibreFang starts publishing GitHub Releases. The script verifies SHA256 checksums and adds the CLI to your user PATH.

Option 4: Docker

docker pull ghcr.io/librefang/librefang:latest

docker run -d \
  --name librefang \
  -p 4200:4200 \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  -v librefang-data:/data \
  ghcr.io/librefang/librefang:latest

Or use Docker Compose:

git clone https://github.com/librefang/librefang.git
cd librefang
# Set your API keys in environment or .env file
docker compose up -d
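
The .env file mentioned in the comment above is a plain KEY=value file that Docker Compose loads automatically from the project directory. A minimal example (the key value is a placeholder, substitute your real key):

```shell
# Create a .env file next to the compose file; Docker Compose reads it
# automatically when you run `docker compose up`.
cat > .env <<'EOF'
ANTHROPIC_API_KEY=sk-ant-your-key-here
EOF
```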

Verify Installation

librefang --version

Configuration

Initialize

Run the init command to create the ~/.librefang/ directory and a default config file:

librefang init

This creates:

~/.librefang/
  config.toml    # Main configuration
  data/          # Database and runtime data
  agents/        # Agent manifests (optional)

Set Up an API Key

LibreFang needs at least one LLM provider API key. Set it as an environment variable:

# Anthropic (Claude)
export ANTHROPIC_API_KEY=sk-ant-...

# Or OpenAI
export OPENAI_API_KEY=sk-...

# Or Groq (free tier available)
export GROQ_API_KEY=gsk_...

Add the export to your shell profile (~/.bashrc, ~/.zshrc, etc.) to persist it.
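
For example, to persist a key in bash (placeholder value shown; zsh users would append to ~/.zshrc instead):

```shell
# Append the export to your bash profile so new shell sessions pick it up.
echo 'export GROQ_API_KEY=gsk_your-key-here' >> ~/.bashrc
```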

Edit the Config

The default config uses Anthropic. To change the provider, edit ~/.librefang/config.toml:

[default_model]
provider = "groq"                      # anthropic, openai, groq, ollama, etc.
model = "llama-3.3-70b-versatile"      # Model identifier for the provider
api_key_env = "GROQ_API_KEY"           # Env var holding the API key

[memory]
decay_rate = 0.05                      # Memory confidence decay rate

[network]
listen_addr = "127.0.0.1:4200"        # OFP listen address
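
To run fully locally, the same [default_model] table can point at an Ollama server instead. This sketch assumes api_key_env may be omitted for local providers; the model name is just an example of a locally pulled Ollama model:

```toml
[default_model]
provider = "ollama"
model = "llama3.2"          # any model you have pulled with `ollama pull`
# No api_key_env needed for a local Ollama server
```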

Verify Your Setup

librefang doctor

This checks that your config exists, API keys are set, and the toolchain is available.


Spawn Your First Agent

Using a Built-in Template

LibreFang ships with 30 agent templates. Spawn the hello-world agent:

librefang agent spawn agents/hello-world/agent.toml

Output:

Agent spawned successfully!
  ID:   a1b2c3d4-e5f6-...
  Name: hello-world

Using a Custom Manifest

Create your own my-agent.toml:

name = "my-assistant"
version = "0.1.0"
description = "A helpful assistant"
author = "you"
module = "builtin:chat"

[model]
provider = "groq"
model = "llama-3.3-70b-versatile"

[capabilities]
tools = ["file_read", "file_list", "web_fetch"]
memory_read = ["*"]
memory_write = ["self.*"]

Then spawn it:

librefang agent spawn my-agent.toml

List Running Agents

librefang agent list

Output:

ID                                     NAME             STATE      PROVIDER     MODEL
-----------------------------------------------------------------------------------------------
a1b2c3d4-e5f6-...                     hello-world      Running    groq         llama-3.3-70b-versatile

Chat with an Agent

Start an interactive chat session using the agent ID:

librefang agent chat a1b2c3d4-e5f6-...

Or use the quick chat command (picks the first available agent):

librefang chat

Or specify an agent by name:

librefang chat hello-world

Example session:

Chat session started (daemon mode). Type 'exit' or Ctrl+C to quit.

you> Hello! What can you do?

agent> I'm the hello-world agent running on LibreFang. I can:
- Read files from the filesystem
- List directory contents
- Fetch web pages

Try asking me to read a file or look up something on the web!

  [tokens: 142 in / 87 out | iterations: 1]

you> List the files in the current directory

agent> Here are the files in the current directory:
- Cargo.toml
- Cargo.lock
- README.md
- agents/
- crates/
- docs/
...

you> exit
Chat session ended.

Start the Daemon

For persistent agents, multi-user access, and the WebChat UI, start the daemon:

librefang start

Output:

Starting LibreFang daemon...
LibreFang daemon running on http://127.0.0.1:4200
Press Ctrl+C to stop.

The daemon provides:

  • REST API at http://127.0.0.1:4200/api/
  • WebSocket endpoint at ws://127.0.0.1:4200/api/agents/{id}/ws
  • WebChat UI at http://127.0.0.1:4200/
  • OFP networking on port 4200

Check Status

librefang status

Stop the Daemon

Press Ctrl+C in the terminal running the daemon, or:

curl -X POST http://127.0.0.1:4200/api/shutdown

Using the WebChat UI

With the daemon running, open your browser to:

http://127.0.0.1:4200/

The embedded WebChat UI allows you to:

  • View all running agents
  • Chat with any agent in real-time (via WebSocket)
  • See streaming responses as they are generated
  • View token usage per message

Next Steps

Now that you have LibreFang running:

  • Explore agent templates: Browse the agents/ directory for 30 pre-built agents (coder, researcher, writer, ops, analyst, security-auditor, and more).
  • Create custom agents: Write your own agent.toml manifests. See the Architecture guide for details on capabilities and scheduling.
  • Set up channels: Connect any of 40 messaging platforms (Telegram, Discord, Slack, WhatsApp, LINE, Mastodon, and 34 more). See Channel Adapters.
  • Use bundled skills: 60 expert knowledge skills are pre-installed (GitHub, Docker, Kubernetes, security audit, prompt engineering, etc.). See Skill Development.
  • Build custom skills: Extend agents with Python, WASM, or prompt-only skills. See Skill Development.
  • Use the API: 76 REST/WS/SSE endpoints, including an OpenAI-compatible /v1/chat/completions. See API Reference.
  • Switch LLM providers: 20 providers supported (Anthropic, OpenAI, Gemini, Groq, DeepSeek, xAI, Ollama, and more). Per-agent model overrides.
  • Set up workflows: Chain multiple agents together. Use librefang workflow create with a TOML workflow definition.
  • Use MCP: Connect to external tools via Model Context Protocol. Configure in config.toml under [[mcp_servers]].
  • Migrate from OpenClaw: Run librefang migrate --from openclaw. See MIGRATION.md.
  • Desktop app: Run cargo tauri dev for a native desktop experience with system tray.
  • Run diagnostics: librefang doctor checks your entire setup.
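
As one concrete illustration of the MCP item above, a [[mcp_servers]] entry in ~/.librefang/config.toml might look like the following. The field names here are assumptions, so verify them against the configuration reference before use:

```toml
# Hypothetical MCP server entry; field names are illustrative.
[[mcp_servers]]
name = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
```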

Useful Commands Reference

librefang init                          # Initialize ~/.librefang/
librefang start                         # Start the daemon
librefang status                        # Check daemon status
librefang doctor                        # Run diagnostic checks

librefang agent spawn <manifest.toml>   # Spawn an agent
librefang agent list                    # List all agents
librefang agent chat <id>               # Chat with an agent
librefang agent kill <id>               # Kill an agent

librefang workflow list                 # List workflows
librefang workflow create <file.toml>   # Create a workflow
librefang workflow run <id> <input>     # Run a workflow

librefang trigger list                  # List event triggers
librefang trigger create <args>         # Create a trigger
librefang trigger delete <id>           # Delete a trigger

librefang skill install <source>        # Install a skill
librefang skill list                    # List installed skills
librefang skill search <query>          # Search FangHub
librefang skill create                  # Scaffold a new skill

librefang channel list                  # List channel status
librefang channel setup <channel>       # Interactive setup wizard

librefang config show                   # Show current config
librefang config edit                   # Open config in editor

librefang chat [agent]                  # Quick chat (alias)
librefang migrate --from openclaw       # Migrate from OpenClaw
librefang mcp                           # Start MCP server (stdio)