AI / ML · Python

memora

Memora provides AI agents with persistent memory capabilities.

Free
N/A rating (0 reviews) · 0 installs · 375 GitHub stars

persistent memory · semantic storage · knowledge graphs

Memora is a lightweight MCP server for semantic memory storage, enabling AI agents to retain information across sessions. It supports knowledge graphs and conversational recall for more context-aware interactions, and its hierarchical organization and advanced search capabilities help agents manage and retrieve information effectively.

The server uses SQLite for persistent storage, with optional cloud synchronization. Data can be exported and imported, so memories can be backed up and restored with a choice of merge strategies. Advanced querying includes semantic search over vector embeddings, making Memora a powerful tool for AI-driven applications.
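To make the semantic-memory pattern described above concrete, here is a minimal sketch of the general technique: SQLite-backed persistence plus similarity search over vector embeddings. The table schema, class names, and the toy bag-of-words embedding are illustrative assumptions, not Memora's actual implementation (real servers use learned embeddings).

```python
import json
import math
import sqlite3
from collections import Counter

def embed(text: str) -> dict[str, float]:
    """Toy embedding: a normalized bag-of-words vector (stand-in for a learned model)."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(c * c for c in counts.values()))
    return {w: c / norm for w, c in counts.items()}

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse vectors."""
    return sum(a[w] * b.get(w, 0.0) for w in a)

class MemoryStore:
    """SQLite-backed store with embedding-based recall (hypothetical schema)."""

    def __init__(self, path: str = ":memory:") -> None:
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memories "
            "(id INTEGER PRIMARY KEY, text TEXT, vec TEXT)"
        )

    def remember(self, text: str) -> None:
        # Persist the text alongside its embedding so recall survives restarts.
        self.db.execute(
            "INSERT INTO memories (text, vec) VALUES (?, ?)",
            (text, json.dumps(embed(text))),
        )
        self.db.commit()

    def recall(self, query: str, k: int = 3) -> list[str]:
        # Rank all stored memories by similarity to the query embedding.
        q = embed(query)
        rows = self.db.execute("SELECT text, vec FROM memories").fetchall()
        scored = sorted(rows, key=lambda r: cosine(q, json.loads(r[1])), reverse=True)
        return [text for text, _ in scored[:k]]

store = MemoryStore()
store.remember("the user prefers dark mode")
store.remember("the project deadline is Friday")
print(store.recall("what theme does the user like?", k=1))
```

Passing a file path instead of `":memory:"` gives the cross-session persistence the description refers to; an MCP server would expose `remember` and `recall` as tools rather than a Python API.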

Compatible with

Claude Desktop · Cursor

Install

Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "memora": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memora"
      ]
    }
  }
}

Config File Location

Mac: ~/Library/Application Support/Claude/claude_desktop_config.json

Windows: %APPDATA%\Claude\claude_desktop_config.json

Linux: ~/.config/claude/claude_desktop_config.json

Some servers require additional setup - check the GitHub README for specific instructions.
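If you already have other MCP servers configured, a small script can merge the entry above into the existing file instead of editing JSON by hand. This is a convenience sketch, not part of Memora; the `add_memora` helper name is made up, and the path argument should be the config location for your OS from the list above.

```python
import json
import os

# The server entry exactly as shown in the install snippet above.
MEMORA_ENTRY = {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-memora"],
}

def add_memora(config_path: str) -> dict:
    """Load claude_desktop_config.json (or start fresh), add the memora
    server without clobbering existing entries, and write it back."""
    config = {}
    if os.path.exists(config_path):
        with open(config_path) as f:
            config = json.load(f)
    config.setdefault("mcpServers", {})["memora"] = MEMORA_ENTRY
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return config
```

Restart Claude Desktop after writing the file so the new server is picked up.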
