Fred is both:
- An innovation lab — helping developers rapidly explore agentic patterns, domain-specific logic, and custom tools.
- A production-ready platform — already integrated with real enterprise constraints: auth, security, document lifecycle, and deployment best practices.
It is composed of:
- a Python agentic backend (FastAPI + LangGraph)
- a Python knowledge flow backend (FastAPI) for document ingestion and vector search
- a React frontend
Fred is not a framework, but a full reference implementation that shows how to build practical multi-agent applications with LangChain and LangGraph. Agents cooperate to answer technical, context-aware questions.
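To make the multi-agent idea concrete, here is a minimal, hypothetical LangGraph sketch of two cooperating nodes. The state, node names, and stubbed logic are illustrative only and are not taken from Fred's codebase.

```python
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class ChatState(TypedDict):
    question: str
    context: str
    answer: str


def retrieve(state: ChatState) -> dict:
    # A retrieval agent would query a knowledge backend here (stubbed).
    return {"context": f"documents relevant to: {state['question']}"}


def respond(state: ChatState) -> dict:
    # A response agent would call an LLM with the retrieved context (stubbed).
    return {"answer": f"answer grounded in: {state['context']}"}


graph = StateGraph(ChatState)
graph.add_node("retrieve", retrieve)
graph.add_node("respond", respond)
graph.add_edge(START, "retrieve")
graph.add_edge("retrieve", "respond")
graph.add_edge("respond", END)

app = graph.compile()
print(app.invoke({"question": "How do I deploy Fred?", "context": "", "answer": ""}))
```

Fred's actual agents are richer (LLM calls, tools, routing between agents), but this graph-of-cooperating-nodes pattern is the LangGraph foundation the platform builds on.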
See the project site: https://fredk8.dev
Contents:
- Getting started
- Production mode
- Agent coding academy
- Advanced configuration
- Core Architecture and Licensing Clarity
- Documentation
- Contributing
- Community
- Contacts
To ensure a smooth first-time experience, Fred’s maintainers designed the Dev Container and native startup modes to require no additional external components (except, of course, access to LLM APIs).
By default, using either Dev Container or native startup:
- Fred stores all data on the local filesystem or through local-first tools such as DuckDB (for SQL-like data) and ChromaDB (for local embeddings). Data includes metrics, chat conversations, document uploads, and embeddings (see the storage sketch after this list).
- Authentication and authorization are mocked.
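As a rough illustration of that local-first default, the sketch below uses DuckDB and ChromaDB directly. The file paths, table name, and collection name are hypothetical and do not reflect Fred's actual storage layout or schemas.

```python
import duckdb
import chromadb

# Hypothetical local paths; Fred's actual layout and schemas differ.
sql = duckdb.connect("local_data/fred.duckdb")
sql.execute("CREATE TABLE IF NOT EXISTS metrics (name TEXT, value DOUBLE)")
sql.execute("INSERT INTO metrics VALUES ('chat_latency_ms', 420.0)")
print(sql.execute("SELECT * FROM metrics").fetchall())

# Local, persistent embedding store -- no external vector database required.
vectors = chromadb.PersistentClient(path="local_data/chroma")
docs = vectors.get_or_create_collection("documents")
docs.add(ids=["doc-1"], documents=["Fred ingests documents for vector search."])
print(docs.query(query_texts=["document ingestion"], n_results=1))
```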
Note:
Across all setup modes, a common requirement is access to Large Language Model (LLM) APIs via a model provider. Supported options include:
- Public OpenAI APIs: Connect using your OpenAI API key.
- Private Ollama Server: Host open-source models such as Mistral, Qwen, Gemma, and Phi locally or on a shared server.
- Private Azure AI Endpoints: Connect using your Azure OpenAI key.
Detailed instructions for configuring your chosen model provider are provided below.
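For orientation, the snippet below shows how each provider is typically instantiated with the LangChain integration packages. The model names, deployment name, and environment variable names are examples, not Fred's configuration schema; follow the detailed instructions below for the real setup.

```python
import os

from langchain_openai import ChatOpenAI, AzureChatOpenAI
from langchain_ollama import ChatOllama

# Public OpenAI API (uses your OpenAI API key).
llm = ChatOpenAI(model="gpt-4o-mini", api_key=os.environ["OPENAI_API_KEY"])

# Private Ollama server (open-source models such as Mistral, Qwen, Gemma, Phi).
# llm = ChatOllama(model="mistral", base_url="http://localhost:11434")

# Private Azure AI endpoint (uses your Azure OpenAI key).
# llm = AzureChatOpenAI(
#     azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
#     api_key=os.environ["AZURE_OPENAI_API_KEY"],
#     azure_deployment="my-gpt-4o-deployment",
#     api_version="2024-06-01",
# )

print(llm.invoke("Say hello from Fred.").content)
```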
Choose how you want to prepare Fred's development environment:
Prefer an isolated environment with everything pre-installed?
The Dev Container setup takes care of all dependencies for the agentic backend, knowledge-flow backend, and frontend components.
| Tool | Purpose |
|---|---|
| Docker / Docker Desktop | Runs the container |
| VS Code | Primary IDE |
| Dev Containers extension (`ms-vscode-remote.remote-containers`) | Opens the repo inside the container |
- Clone (or open) the repository in VS Code.
- Press F1 → Dev Containers: Reopen in Container.
- VS Code builds `.devcontainer/Dockerfile-devcontainer` and runs the `post-create` script.
On first start, the script:
- Installs the Python virtual environments for `fred-core`, `agentic_backend`, and `knowledge-flow-backend`
- Installs `frontend/node_modules`
When the terminal prompt appears, the workspace is ready. Ports `8000` (agentic backend), `8111` (Knowledge Flow backend), and `5173` (frontend, Vite) are automatically forwarded to the host.
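If you want to confirm the forwarding worked, a quick check like the following (port numbers taken from above) is enough:

```python
import socket

# Check that the forwarded ports are reachable from the host.
services = [
    ("agentic backend", 8000),
    ("knowledge-flow backend", 8111),
    ("frontend (vite)", 5173),
]

for name, port in services:
    with socket.socket() as s:
        s.settimeout(1)
        status = "up" if s.connect_ex(("localhost", port)) == 0 else "down"
    print(f"{name}: localhost:{port} is {status}")
```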
- Rebuild the container: F1 → Dev Containers: Rebuild Container
- Dependencies feel stale? Delete the relevant `.venv` or `frontend/node_modules` inside the container, then rerun the associated `make` target.
- Need to change API keys or models? Update the backend `.env` files inside the container and restart the relevant service. See Model configuration for more details.