MemoryOS


🎉 If you like our project, please give us a star ⭐ on GitHub to get the latest updates.

MemoryOS is designed to provide a memory operating system for personalized AI agents, enabling more coherent, personalized, and context-aware interactions. Drawing inspiration from memory management principles in operating systems, it adopts a hierarchical storage architecture with four core modules (Storage, Updating, Retrieval, and Generation) to achieve comprehensive and efficient memory management. On the LoCoMo benchmark, MemoryOS achieves average improvements of 49.11% in F1 and 46.18% in BLEU-1 scores.

✨ Key Features

  • 🏆 Top Performance in Memory Management
    Achieves SOTA results on long-term memory benchmarks, boosting F1 by 49.11% and BLEU-1 by 46.18% on LoCoMo.

  • 🧠 Plug-and-Play Memory Management Architecture
    Enables seamless integration of pluggable memory modules—including storage engines, update strategies, and retrieval algorithms.

  • Create Agent Workflows with Ease (MemoryOS-MCP)
    Inject long-term memory capabilities into various AI applications by calling the modular tools provided by the MCP Server.

  • 🌐 Universal LLM Support
    MemoryOS integrates seamlessly with a wide range of LLMs (e.g., OpenAI, Deepseek, Qwen, ...).

📣 Latest News

  • [new] 🔥🔥🔥 [2025-07-15]: 🔌 Support for the Chromadb vector database
  • [new] 🔥🔥🔥 [2025-07-15]: 🔌 Docker integrated into deployment
  • [new] 🔥🔥 [2025-07-14]: ⚡ MCP parallelization acceleration
  • [new] 🔥🔥 [2025-07-14]: 🔌 Support for BGE-M3 & Qwen3 embeddings on PyPI and MCP
  • [new] 🔥 [2025-07-09]: 📊 Evaluation of MemoryOS on the LoCoMo dataset is publicly available 👉 Reproduce.
  • [new] 🔥 [2025-07-08]: 🏆 New Config Parameter
  • New configuration parameter: similarity_threshold. For the configuration file, see the 📖 Documentation page.
  • [new] [2025-07-07]: 🚀 5 Times Faster
  • The MemoryOS (PyPI) implementation has been upgraded: 5 times faster (reduced latency) through parallelization optimizations.
  • [new] [2025-07-07]: ✨ R1 Models Now Supported
  • MemoryOS supports configuring and using reasoning models such as Deepseek-R1 and Qwen3.
  • [new] [2025-07-07]: ✨ MemoryOS Playground Launched
  • The MemoryOS Playground has been launched! 👉 MemoryOS Platform. If you need an invitation code, please feel free to contact us.
  • [new] [2025-06-15]: 🛠️ Open-sourced MemoryOS-MCP released! Now configurable on agent clients for seamless integration and customization. 👉 MemoryOS-MCP.
  • [2025-05-30]: 📄 Paper-Memory OS of AI Agent is available on arXiv: https://arxiv.org/abs/2506.06326.
  • [2025-05-30]: Initial version of MemoryOS launched! Featuring short-term, mid-term, and long-term persona Memory with automated user profile and knowledge updating.

🔥 MemoryOS Support List

| Type           | Name           | Configuration              | Description                  |
|----------------|----------------|----------------------------|------------------------------|
| Agent Client   | Claude Desktop | claude_desktop_config.json | Anthropic official client    |
| Agent Client   | Cline          | VS Code settings           | VS Code extension            |
| Agent Client   | Cursor         | Settings panel             | AI code editor               |
| Model Provider | OpenAI         | OPENAI_API_KEY             | GPT-4, GPT-3.5, etc.         |
| Model Provider | Anthropic      | ANTHROPIC_API_KEY          | Claude series                |
| Model Provider | Deepseek-R1    | DEEPSEEK_API_KEY           | Chinese large model          |
| Model Provider | Qwen/Qwen3     | QWEN_API_KEY               | Alibaba Qwen                 |
| Model Provider | vLLM           | Local deployment           | Local model inference        |
| Model Provider | Llama_factory  | Local deployment           | Local fine-tuning deployment |
All model calls use the OpenAI API interface; you need to supply the API key and base URL.
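
For example, a provider other than OpenAI can be used by pointing MemoryOS at that provider's OpenAI-compatible endpoint. The following is a minimal sketch, not an official example: the constructor parameters match the Basic Usage section below, while the base URL and model name are assumptions for DeepSeek (a local vLLM server would typically expose http://localhost:8000/v1); substitute the values your provider documents.

# Minimal sketch: MemoryOS with a non-OpenAI provider via its OpenAI-compatible API.
from memoryos import Memoryos

memo = Memoryos(
    user_id="demo_user",
    assistant_id="demo_assistant",
    data_storage_path="./provider_demo_data",
    openai_api_key="YOUR_PROVIDER_API_KEY",         # the provider's API key
    openai_base_url="https://api.deepseek.com/v1",  # OpenAI-compatible base URL (assumed)
    llm_model="deepseek-reasoner",                  # provider-specific model name (assumed)
)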

🏗️ System Architecture

(Figure: MemoryOS system architecture)

🏗️ Project Structure

memoryos/
├── __init__.py            # Initializes the MemoryOS package
├── __pycache__/           # Python cache directory (auto-generated)
├── long_term.py           # Manages long-term persona memory (user profile, knowledge)
├── memoryos.py            # Main class for MemoryOS, orchestrating all components
├── mid_term.py            # Manages mid-term memory, consolidating short-term interactions
├── prompts.py             # Contains prompts used for LLM interactions (e.g., summarization, analysis)
├── retriever.py           # Retrieves relevant information from all memory layers
├── short_term.py          # Manages short-term memory for recent interactions
├── updater.py             # Processes memory updates, including promoting information between layers
└── utils.py               # Utility functions used across the library

📖 MemoryOS_PyPI Getting Started

Prerequisites

  • Python >= 3.10
  • conda create -n MemoryOS python=3.10
  • conda activate MemoryOS

Installation

Download from PyPI

pip install memoryos-pro -i https://pypi.org/simple

Download from GitHub (latest version)

git clone https://github.com/BAI-LAB/MemoryOS.git
cd MemoryOS/memoryos-pypi
pip install -r requirements.txt

Basic Usage

from memoryos import Memoryos

# --- Basic Configuration ---
USER_ID = "demo_user"
ASSISTANT_ID = "demo_assistant"
API_KEY = "YOUR_OPENAI_API_KEY"  # Replace with your key
BASE_URL = ""  # Optional: if using a custom OpenAI endpoint
DATA_STORAGE_PATH = "./simple_demo_data"
LLM_MODEL = "gpt-4o-mini"

def simple_demo():
    print("MemoryOS Simple Demo")
    
    # 1. Initialize MemoryOS
    print("Initializing MemoryOS...")
    try:
        memo = Memoryos(
            user_id=USER_ID,
            openai_api_key=API_KEY,
            openai_base_url=BASE_URL,
            data_storage_path=DATA_STORAGE_PATH,
            llm_model=LLM_MODEL,
            assistant_id=ASSISTANT_ID,
            short_term_capacity=7,  
            mid_term_heat_threshold=5,  
            retrieval_queue_capacity=7,
            long_term_knowledge_capacity=100,
            #Support Qwen/Qwen3-Embedding-0.6B, BAAI/bge-m3, all-MiniLM-L6-v2
            embedding_model_name="BAAI/bge-m3"
        )
        print("MemoryOS initialized successfully!\n")
    except Exception as e:
        print(f"Error: {e}")
        return

    # 2. Add a memory: one user/assistant exchange
    print("Adding some memories...")
    
    memo.add_memory(
        user_input="Hi! I'm Tom, I work as a data scientist in San Francisco.",
        agent_response="Hello Tom! Nice to meet you. Data science is such an exciting field. What kind of data do you work with?"
    )
     
    # 3. Ask a question that relies on the stored memory
    test_query = "What do you remember about my job?"
    print(f"User: {test_query}")
    
    response = memo.get_response(
        query=test_query,
    )
    
    print(f"Assistant: {response}")

if __name__ == "__main__":
    simple_demo()

📖 MemoryOS-MCP Getting Started

🔧 Core Tools

1. add_memory

Saves the content of the conversation between the user and the AI assistant into the memory system, for the purpose of building a persistent dialogue history and contextual record.

2. retrieve_memory

Retrieves related historical dialogues, user preferences, and knowledge information from the memory system based on a query, helping the AI assistant understand the user’s needs and background.

3. get_user_profile

Obtains a user profile generated from the analysis of historical dialogues, including the user’s personality traits, interest preferences, and relevant knowledge background.
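
Once the server is running (see the setup steps below), these tools can be called from any MCP-compatible client. The snippet below is a minimal sketch, assuming the standard MCP Python SDK (the mcp package) stdio client; the tool argument names (user_input, agent_response, query) mirror the PyPI API above but are assumptions, so check server_new.py for the exact schemas.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the MemoryOS-MCP server over stdio (same command as the "Start the server" step below).
server_params = StdioServerParameters(
    command="python",
    args=["server_new.py", "--config", "config.json"],
)

async def demo():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Store one user/assistant exchange (argument names are assumed).
            await session.call_tool("add_memory", arguments={
                "user_input": "Hi! I'm Tom, I work as a data scientist.",
                "agent_response": "Nice to meet you, Tom!",
            })

            # Retrieve related memories and the generated user profile.
            memories = await session.call_tool("retrieve_memory", arguments={"query": "What is my job?"})
            profile = await session.call_tool("get_user_profile", arguments={})
            print(memories, profile)

if __name__ == "__main__":
    asyncio.run(demo())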

1. Install dependencies

cd memoryos-mcp
pip install -r requirements.txt

2. Configuration

Edit config.json

{
  "user_id": "user ID",
  "openai_api_key": "OpenAI API key",
  "openai_base_url": "https://api.openai.com/v1",
  "data_storage_path": "./memoryos_data",
  "assistant_id": "assistant_id",
  "llm_model": "gpt-4o-mini"
  "embedding_model_name":"BAAI/bge-m3"
}

3. Start the server

python server_new.py --config config.json

4. Test

python test_comprehensive.py

5. Configure it on Cline and other clients

Copy the mcp.json file over, and make sure the file path is correct.

command": "/root/miniconda3/envs/memos/bin/python"
#This should be changed to the Python interpreter of your virtual environment
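
For reference, the corresponding entry in an MCP client configuration (Cline, Claude Desktop, etc.) might look roughly like the sketch below. The "mcpServers" layout is the common MCP client convention rather than something specific to this repository, and the /path/to/... paths are placeholders for your own checkout and environment.

{
  "mcpServers": {
    "memoryos": {
      "command": "/root/miniconda3/envs/memos/bin/python",
      "args": [
        "/path/to/MemoryOS/memoryos-mcp/server_new.py",
        "--config",
        "/path/to/MemoryOS/memoryos-mcp/config.json"
      ]
    }
  }
}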

📖 MemoryOS_Chromadb Getting Started

1. Install dependencies

cd memoryos-chromadb
pip install -r requirements.txt

2. Test

Edit the configuration in comprehensive_test.py:
    memoryos = Memoryos(
        user_id='travel_user_test',
        openai_api_key='',
        openai_base_url='',
        data_storage_path='./comprehensive_test_data',
        assistant_id='travel_assistant',
        embedding_model_name='BAAI/bge-m3',
        mid_term_capacity=1000,
        mid_term_heat_threshold=13.0,
        mid_term_similarity_threshold=0.7,
        short_term_capacity=2
    )
python3 comprehensive_test.py
# Make sure to use a different data storage path when switching embedding models.

📖 Docker Getting Started

You can run MemoryOS using Docker in two ways: by pulling the official image or by building your own image from the Dockerfile. Both methods are suitable for quick setup, testing, and production deployment.

Option 1: Pull the Official Image

# Pull the latest official image
docker pull ghcr.io/bai-lab/memoryos:latest

docker run -it --gpus=all ghcr.io/bai-lab/memoryos /bin/bash

Option 2: Build from Dockerfile

# Clone the repository
git clone https://github.com/BAI-LAB/MemoryOS.git
          
cd MemoryOS

# Build the Docker image (make sure Dockerfile is present)
docker build -t memoryos .

docker run -it --gpus=all memoryos /bin/bash

🎯 Reproduce

cd eval
# Configure API keys and other settings in the code
python3 main_loco_parse.py
python3 evalution_loco.py

☑️ Todo List

MemoryOS is continuously evolving! Here's what's coming:

  • Ongoing 🚀: Integrated Benchmarks, a standardized benchmark suite with cross-model comparisons against Mem0, Zep, and OpenAI.
  • 🏗️ Enabling seamless memory exchange and integration across diverse systems.

Have ideas or suggestions? Contributions are welcome! Please feel free to submit issues or pull requests! 🚀

📖 Documentation

More detailed documentation is coming soon 🚀 and will be published on the Documentation page.

📣 Citation

If you find this project useful, please consider citing our paper:

@misc{kang2025memoryosaiagent,
      title={Memory OS of AI Agent}, 
      author={Jiazheng Kang and Mingming Ji and Zhe Zhao and Ting Bai},
      year={2025},
      eprint={2506.06326},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2506.06326}, 
}

🎯 Contact us

BaiJia AI is a research team guided by Associate Professor Bai Ting from Beijing University of Posts and Telecommunications, dedicated to creating emotionally rich and super-memory brains for AI agents.

🤝 Cooperation and Suggestions: baiting@bupt.edu.cn

📣 Follow our WeChat official account, join the WeChat group, or join our Discord (https://discord.gg/SqVj7QvZ) to get the latest updates.

百家Agent (BaiJia Agent) WeChat official account QR code | WeChat group QR code

🌟 Star History

Star History Chart
