
✨ mem0 Memory System ✨

A flexible memory system for AI applications that can be used in two ways:

  1. As an MCP (Model Context Protocol) server for integration with MCP-compatible applications
  2. As a direct library integration for embedding memory capabilities directly in your applications

Made with ❤️ by Pink Pixel

Features

  • Multi-Provider Support: Use OpenAI, Anthropic, Google, DeepSeek, OpenRouter, or Ollama (local)
  • Flexible Embedding Options: Choose from OpenAI, HuggingFace, or Ollama for embeddings
  • Local Storage: Store memories locally with ChromaDB and SQLite
  • Configurable: Customize data directories, models, and parameters
  • Autonomous Memory: Automatically extracts, stores, and retrieves user information without explicit commands
  • User Isolation: Support for multiple users with isolated memory spaces (see the sketch after this list)
  • Two Integration Methods: Server-based (MCP) or direct library integration
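
As an example of the user-isolation feature, memories stored under one user_id are not returned for another. A minimal sketch using the Mem0Client described later in this README (method names taken from the client example below):

from client import Mem0Client

client = Mem0Client(base_url="http://localhost:8000")

# Each memory is scoped to the user_id it was stored under
client.add_memory(content="Alice prefers dark mode", user_id="alice")
client.add_memory(content="Bob prefers light mode", user_id="bob")

# Searching as "alice" only returns memories stored for "alice"
results = client.search_memories(query="preferred theme", user_id="alice")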

Installation

There are multiple ways to install and set up the mem0 MCP server:

Method 1: Manual Installation

  1. Clone the repository and navigate to the directory:

    git clone https://github.com/pinkpixel-dev/mem0-mcp.git
    cd mem0-mcp
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Create a .env file from the example:

    cp .env.example .env
    
  4. Edit the .env file with your API keys and settings (see Environment Configuration Guide for details).

Method 2: Using the Installer Script

The project includes a convenient installer script that automates the setup process:

./install.sh

The installer script provides the following features:

  • Creates a Python virtual environment

  • Installs all dependencies

  • Sets up environment configuration

  • Provides a guided setup experience with visual feedback

You can also customize the installation with options:

# For a quick, non-interactive installation
./install.sh --quick

# To specify a custom environment directory
./install.sh --env-dir ./custom_env

# To use a specific installation method (pip, uv, or conda)
./install.sh --method pip

Running the Server

There are multiple ways to run the mem0 MCP server:

Method 1: Using the Python Script Directly

Start the server with default settings:

python server.py

The server will automatically find the next available port if port 8000 is already in use.

Or customize with command-line arguments:

python server.py --host 127.0.0.1 --port 8080 --provider ollama --embedding-provider ollama --data-dir ./custom_memory_data

Additional options:

# Disable automatic port finding
python server.py --no-auto-port

# Enable auto-reload for development
python server.py --reload
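
For reference, automatic port selection typically works like the following sketch (illustrative only; find_free_port is a hypothetical helper, not part of this project):

import socket

def find_free_port(start=8000, max_tries=100):
    """Return the first port at or above `start` that accepts a bind."""
    for port in range(start, start + max_tries):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            try:
                sock.bind(("0.0.0.0", port))
                return port  # bind succeeded, so the port is free
            except OSError:
                continue  # port in use, try the next one
    raise RuntimeError("no free port found")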

Method 2: Using the Run Server Script

For a more convenient experience, use the included shell script:

./run_server.sh

This script supports the same options as the Python script:

# Customize host and port
./run_server.sh --host 127.0.0.1 --port 8080

# Specify providers
./run_server.sh --provider openai --embedding ollama

# Set a custom data directory
./run_server.sh --data-dir ~/mem0_data

Method 3: Using the Launcher for MCP-Compatible Applications

For integrating with command-line applications like Cursor that use MCP, use the launcher script:

python start_mem0_server.py

This script is specifically designed for MCP integration and outputs the server information in the format expected by MCP clients. It supports the same options as the other methods:

# Customize settings
python start_mem0_server.py --host 127.0.0.1 --port 8080 --provider ollama

# Run in quiet mode (reduced output)
python start_mem0_server.py --quiet

Integration Methods

Method 1: MCP Server (for MCP-Compatible Applications)

The MCP server provides a RESTful API that can be used by any application that supports the Model Context Protocol (MCP).

API Endpoints

Endpoint            | Method | Description
/configure          | POST   | Configure the memory provider
/memory/add         | POST   | Add a memory
/memory/search      | POST   | Search for memories
/memory/chat        | POST   | Chat with the AI using memories
/memory/{memory_id} | GET    | Get a memory by ID
/memory/{memory_id} | DELETE | Delete a memory by ID
/memories           | DELETE | Clear all memories
/health             | GET    | Health check
/providers          | GET    | List available providers
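
Because these are plain HTTP endpoints, you can also call them without the bundled client. A minimal sketch using the requests library (the JSON field names are assumptions based on the client example below):

import requests

BASE_URL = "http://localhost:8000"

# Configure the memory provider (field names assumed to mirror the client API)
resp = requests.post(f"{BASE_URL}/configure", json={
    "provider": "openai",
    "embedding_provider": "openai",
    "data_dir": "./memory_data",
})
resp.raise_for_status()

# Add a memory
resp = requests.post(f"{BASE_URL}/memory/add", json={
    "content": "This is an important fact to remember",
    "user_id": "user123",
})
resp.raise_for_status()
print(resp.json())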

Using the Client

from client import Mem0Client

# Initialize the client
client = Mem0Client(base_url="http://localhost:8000")

# Configure the memory provider
client.configure(
    provider="openai",
    embedding_provider="openai",
    data_dir="./memory_data"
)

# Add a memory
memory_id = client.add_memory(
    content="This is an important fact to remember",
    user_id="user123"
)

# Search for memories
results = client.search_memories(
    query="important fact",
    user_id="user123"
)

# Chat with context from memories
response = client.chat(
    message="What important information do you have?",
    user_id="user123"
)

print(response)

Integration with MCP-Compatible Applications

To integrate with applications that support MCP servers:

  1. Start the mem0 MCP server using the launcher script:

    python start_mem0_server.py
    
  2. The script will output MCP server information in the format expected by MCP clients:

    {"name": "mem0", "capabilities": ["memory"], "url": "http://0.0.0.0:8000", "version": "1.0.0"}
    
  3. Configure your application to use the MCP server URL

  4. Use the memory capabilities through your application's interface
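
Before pointing an application at the server, you can confirm it is reachable by polling the /health endpoint listed above. A minimal sketch (wait_for_server is a hypothetical helper; only the HTTP status code is checked, since the response body format is not documented here):

import time
import requests

def wait_for_server(url="http://localhost:8000", timeout=30.0):
    """Poll the /health endpoint until the server responds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            if requests.get(f"{url}/health", timeout=2).ok:
                return True
        except requests.RequestException:
            pass  # server not up yet
        time.sleep(0.5)
    return False

if wait_for_server():
    print("mem0 MCP server is ready")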

For detailed instructions, see the MCP Integration Guide (MCP_INTEGRATION_GUIDE.md).

Method 2: Direct Library Integration (for Any Application)

You can directly integrate the memory system into your own applications without running a separate server:

from mem0.memory import MemoryManager
from mem0.providers import OpenAIProvider, OllamaEmbeddingProvider

# Initialize the memory manager
memory_manager = MemoryManager(
    provider=OpenAIProvider(api_key="your_openai_api_key"),
    embedding_provider=OllamaEmbeddingProvider(),
    data_dir="./memory_data"
)

# Add a memory
memory_id = memory_manager.add_memory(
    content="This is an important fact to remember",
    user_id="user123"
)

# Search for memories
results = memory_manager.search_memories(
    query="important fact",
    user_id="user123"
)

# Get memory by ID
memory = memory_manager.get_memory(memory_id)

# Delete a memory
memory_manager.delete_memory(memory_id)

# Clear all memories for a user
memory_manager.clear_memories(user_id="user123")

For detailed instructions, see the Direct Integration Guide.

Autonomous Memory System

The mem0 memory system includes an autonomous memory feature that can:

  1. Automatically extract important information from user interactions
  2. Store memories without explicit commands
  3. Retrieve relevant memories when needed for context
  4. Enhance the AI's responses by injecting memories into the context

This creates a seamless experience where the AI naturally remembers details about the user without requiring explicit memory commands.
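
For example, a fact mentioned in passing during one chat turn can be recalled in a later turn without any explicit memory command. A hedged sketch using the Mem0Client from above (whether a given fact is extracted and recalled depends on the configured provider):

from client import Mem0Client

client = Mem0Client(base_url="http://localhost:8000")

# Turn 1: the user mentions a fact in passing; no "remember this" command is given
client.chat(message="By the way, my favorite language is Python.", user_id="user123")

# Turn 2: the relevant memory is retrieved and injected into the context,
# so the response can draw on the earlier fact
response = client.chat(message="What language should my next project use?", user_id="user123")
print(response)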

Try the Autonomous Memory Example

The repository includes an example of the autonomous memory system:

python examples/autonomous_memory.py

This example demonstrates:

  • Automatic extraction of personal information
  • Contextual retrieval of memories
  • Natural incorporation of memories into responses

Environment Configuration

The mem0 memory system can be configured using environment variables or a .env file:

# LLM Provider Configuration
MEM0_PROVIDER=openai
OPENAI_API_KEY=your_openai_api_key

# Embedding Provider Configuration
MEM0_EMBEDDING_PROVIDER=openai

# Storage Configuration
MEM0_DATA_DIR=~/mem0_memories

# Server Configuration
MEM0_HOST=0.0.0.0
MEM0_PORT=8000

For a complete list of configuration options, see the Environment Configuration Guide.
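
If you are embedding the library directly, you can load the same settings yourself, for example with python-dotenv (a sketch; the variable names are taken from the example above, and the fallback defaults are assumptions):

import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the current working directory

provider = os.environ.get("MEM0_PROVIDER", "openai")
data_dir = os.path.expanduser(os.environ.get("MEM0_DATA_DIR", "~/mem0_memories"))
host = os.environ.get("MEM0_HOST", "0.0.0.0")
port = int(os.environ.get("MEM0_PORT", "8000"))

print(f"provider={provider} host={host} port={port} data_dir={data_dir}")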

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgements


Made with ❤️ by Pink Pixel | GitHub
