
SPADE-LLM

Build distributed XMPP-based multi-agent systems powered by Large Language Models. SPADE-LLM extends the SPADE multi-agent platform with support for multiple LLM providers, enabling distributed AI applications, intelligent chatbots, and collaborative agent systems.

Key Features

  • 8+ LLM Providers
  • 100% Python Native
  • 0 External Dependencies
  • Agent Scalability
🔧

Built-in XMPP Server

No external server setup required with SPADE 4.0+. Get started instantly with zero configuration.

🧠

Multi-Provider Support

OpenAI GPT, Ollama, LM Studio, vLLM, Anthropic Claude and more. Switch providers seamlessly.

Advanced Tool System

Function calling with async execution, human-in-the-loop workflows, and LangChain integration.
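As a concrete illustration of function calling, the sketch below pairs an async Python function with the JSON-schema parameter description that OpenAI-style providers expect. The tool name, stubbed body, and schema are illustrative assumptions, not SPADE-LLM's actual `LLMTool` API:

```python
import asyncio

# Hypothetical weather lookup exposed to the LLM as a callable tool.
async def get_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    # A real tool would await an HTTP client here; a stub keeps it runnable.
    return f"Sunny in {city}, 22C"

# JSON-schema description of the tool's parameters, in the shape
# function-calling providers consume.
weather_tool_schema = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

print(asyncio.run(get_weather("Valencia")))  # Sunny in Valencia, 22C
```

Because the function is async, the agent can run many tool calls concurrently while waiting on slow external services.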

💾

Dual Memory Architecture

Agent learning and conversation continuity with SQLite persistence and contextual retrieval.
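The idea behind SQLite-backed conversation memory can be sketched in a few lines: one table keyed by conversation id, appended per message and queried on retrieval. The table and column names below are illustrative, not SPADE-LLM's actual schema:

```python
import sqlite3

# In-memory database for the sketch; a file path would persist across runs.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS memory ("
    "conversation_id TEXT, role TEXT, content TEXT)"
)

def remember(conversation_id, role, content):
    """Append one message to a conversation's persistent history."""
    conn.execute("INSERT INTO memory VALUES (?, ?, ?)",
                 (conversation_id, role, content))

def recall(conversation_id):
    """Fetch a conversation's history for contextual retrieval."""
    rows = conn.execute(
        "SELECT role, content FROM memory WHERE conversation_id = ?",
        (conversation_id,))
    return list(rows)

remember("chat-1", "user", "Hi")
remember("chat-1", "assistant", "Hello!")
print(recall("chat-1"))  # [('user', 'Hi'), ('assistant', 'Hello!')]
```

Keying on a conversation id is what lets an agent resume a thread after a restart while keeping unrelated conversations separate.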

🎯

Context Management

Multi-conversation support with automatic cleanup and intelligent context window management.
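Context window management boils down to keeping the system prompt while dropping the oldest turns once the history grows too long. The function below is a simplified sketch of that policy, not the actual `ContextManager` API:

```python
# Keep the system prompt plus only the most recent messages.
def trim_context(messages, max_messages=4):
    """Drop the oldest non-system messages beyond `max_messages` total."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    keep = max(max_messages - len(system), 0)
    return system + (rest[-keep:] if keep else [])

history = [{"role": "system", "content": "You are helpful"}]
history += [{"role": "user", "content": f"msg {i}"} for i in range(10)]
trimmed = trim_context(history)
print(len(trimmed))  # 4: the system prompt plus the 3 newest messages
```

Real implementations typically budget by tokens rather than message count, but the shape of the policy is the same.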

🛡️

Guardrails System

Content filtering and safety controls for input/output with customizable rules and policies.
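An input guardrail is, at its core, a rule that inspects text and either passes it through or blocks it with a reason. The term list and function signature below are illustrative assumptions, not SPADE-LLM's guardrail API:

```python
# Illustrative policy: terms that should never reach the LLM.
BLOCKED_TERMS = {"password", "credit card"}

def input_guardrail(text: str):
    """Return (allowed, reason) for an incoming message."""
    lowered = text.lower()
    for term in sorted(BLOCKED_TERMS):
        if term in lowered:
            return False, f"blocked term: {term!r}"
    return True, "ok"

print(input_guardrail("What is my credit card limit?"))
```

The same shape works on the output side, filtering the LLM's responses before they are sent to other agents.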

🔗

Message Routing

Conditional routing based on LLM responses with flexible workflows and decision trees.
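Conditional routing can be pictured as a function that inspects the LLM's reply and returns the JID of the next recipient. The signature, keywords, and JIDs below are illustrative; the real routing hook may differ:

```python
def route_response(sender: str, response: str) -> str:
    """Choose the next recipient JID based on the response text."""
    text = response.lower()
    if "escalate" in text:
        return "supervisor@localhost"   # hand off hard cases
    if "book" in text or "reserve" in text:
        return "booking@localhost"      # forward actionable requests
    return sender                       # default: reply to the original sender

print(route_response("user@localhost", "I will escalate this issue."))
```

Chaining such decisions across agents is what turns individual LLM replies into a workflow or decision tree.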

🌐

MCP Integration

Model Context Protocol server support for external tool integration and service connectivity.
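An MCP server over STDIO is typically described by the command that launches it plus its arguments. The dictionary below follows the common MCP convention; the exact shape SPADE-LLM consumes is an assumption:

```python
# Hypothetical STDIO MCP server entry; field names follow MCP convention.
filesystem_server = {
    "name": "filesystem",
    "transport": "stdio",  # HTTP streaming is the other listed transport
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
}

print(filesystem_server["name"])  # filesystem
```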

Architecture Overview

graph LR
    A[LLMAgent] --> C[ContextManager]
    A --> D[LLMProvider]
    A --> E[LLMTool]
    A --> G[Guardrails]
    A --> M[Memory]
    D --> F[OpenAI/Ollama/etc]
    G --> H[Input/Output Filtering]
    E --> I[Human-in-the-Loop]
    E --> J[MCP]
    E --> P[CustomTool/LangchainTool]
    J --> K[STDIO]
    J --> L[HTTP Streaming]
    M --> N[Agent-based]
    M --> O[Agent-thread]

Quick Start

🐍 Basic Agent Setup
import spade
from spade_llm import LLMAgent, LLMProvider

async def main():
    # First, start SPADE's built-in server:
    # spade run

    provider = LLMProvider.create_openai(
        api_key="your-api-key",
        model="gpt-4o-mini"
    )

    agent = LLMAgent(
        jid="assistant@localhost",
        password="password",
        provider=provider,
        system_prompt="You are a helpful assistant"
    )

    await agent.start()

if __name__ == "__main__":
    spade.run(main())
Python 3.10+ · MIT License · Beta Release

Documentation Structure

  • Getting Started
  • Core Guides
  • Reference

Examples

Explore the examples directory for complete working examples:

  • multi_provider_chat_example.py - Chat with different LLM providers
  • ollama_with_tools_example.py - Local models with tool calling
  • guardrails_example.py - Content filtering and safety controls
  • langchain_tools_example.py - LangChain tool integration
  • valencia_multiagent_trip_planner.py - Multi-agent workflow