LibSQL Memory


spences10

Provides a persistent LibSQL database for storing knowledge graph entities and relationships across AI conversations, enabling AI assistants to remember and build upon previous interactions.



What it does

  • Store entities and relationships in knowledge graphs
  • Search stored knowledge with fuzzy text matching
  • Persist memory across conversation sessions
  • Connect to local SQLite or remote LibSQL databases
  • Rank search results by relevance
  • Manage knowledge graph relationships

Best for

  • AI assistants that need long-term memory
  • Building persistent knowledge bases from conversations
  • LLM applications requiring context continuity
  • Knowledge graph management for AI systems
  • High-performance text search with ranking
  • Works with local and remote databases
  • Optimized for LLM context efficiency

About LibSQL Memory

LibSQL Memory is a community-built MCP server published by spences10 that provides AI assistants with tools and capabilities via the Model Context Protocol. It offers a persistent memory database that uses LibSQL to store and retrieve knowledge graph entities and relations. It is categorized under AI/ML and databases.

How to install

You can install LibSQL Memory in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

License

LibSQL Memory is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

mcp-memory-libsql

A high-performance, persistent memory system for the Model Context Protocol (MCP) powered by libSQL with optimized text search for LLM context efficiency.


Features

  • 🚀 High-performance text search with relevance ranking
  • 💾 Persistent storage of entities and relations
  • 🔍 Flexible text search with fuzzy matching
  • 🎯 Context-optimized for LLM efficiency
  • 🔄 Knowledge graph management
  • 🌐 Compatible with local and remote libSQL databases
  • 🔒 Secure token-based authentication for remote databases

Configuration

This server is designed to be used as part of an MCP configuration. Here are examples for different environments:

Cline Configuration

Add this to your Cline MCP settings:

{
	"mcpServers": {
		"mcp-memory-libsql": {
			"command": "npx",
			"args": ["-y", "mcp-memory-libsql"],
			"env": {
				"LIBSQL_URL": "file:/path/to/your/database.db"
			}
		}
	}
}

Claude Desktop with WSL Configuration

For a detailed guide on setting up this server with Claude Desktop in WSL, see Getting MCP Server Working with Claude Desktop in WSL.

Add this to your Claude Desktop configuration for WSL environments:

{
	"mcpServers": {
		"mcp-memory-libsql": {
			"command": "wsl.exe",
			"args": [
				"bash",
				"-c",
				"source ~/.nvm/nvm.sh && LIBSQL_URL=file:/path/to/database.db /home/username/.nvm/versions/node/v20.12.1/bin/npx mcp-memory-libsql"
			]
		}
	}
}

Database Configuration

The server supports both local SQLite and remote libSQL databases through the LIBSQL_URL environment variable:

For local SQLite databases:

{
	"env": {
		"LIBSQL_URL": "file:/path/to/database.db"
	}
}

For remote libSQL databases (e.g., Turso):

{
	"env": {
		"LIBSQL_URL": "libsql://your-database.turso.io",
		"LIBSQL_AUTH_TOKEN": "your-auth-token"
	}
}

Note: When using WSL, ensure the database path uses the Linux filesystem format (e.g., /home/username/...) rather than Windows format.

By default, if no URL is provided, the server uses file:/memory-tool.db in the current directory.
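The fallback behavior described above can be sketched as a small resolver. This is an illustrative sketch only; the function name is hypothetical, and the server's actual implementation may differ:

```python
import os

def resolve_db_url() -> str:
    # Use LIBSQL_URL when set; otherwise fall back to the documented
    # default of a local file in the current directory.
    return os.environ.get("LIBSQL_URL") or "file:/memory-tool.db"

os.environ.pop("LIBSQL_URL", None)
print(resolve_db_url())  # file:/memory-tool.db
```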

API

The server implements the standard MCP memory interface with optimized text search:

  • Entity Management
    • Create/Update entities with observations
    • Delete entities
    • Search entities by text with relevance ranking
    • Explore entity relationships
  • Relation Management
    • Create relations between entities
    • Delete relations
    • Query related entities
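Because the server implements the standard MCP memory interface, clients reach these operations through ordinary MCP `tools/call` requests over stdio. A sketch of an entity-creation request follows; the tool name and argument shape mirror the reference MCP memory interface and should be verified against this server's own tool listing:

```json
{
	"jsonrpc": "2.0",
	"id": 1,
	"method": "tools/call",
	"params": {
		"name": "create_entities",
		"arguments": {
			"entities": [
				{
					"name": "project-roadmap",
					"entityType": "document",
					"observations": ["Q3 milestones were agreed in the last session"]
				}
			]
		}
	}
}
```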

Architecture

The server uses a libSQL database with the following schema:

  • Entities table: Stores entity information with timestamps
  • Observations table: Stores entity observations
  • Relations table: Stores relationships between entities
  • Text search with relevance ranking (name > type > observation)
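To make the three-table layout concrete, here is a minimal sketch of a comparable schema using Python's built-in sqlite3 module. Column names and constraints are assumptions for illustration, not the server's actual DDL:

```python
import sqlite3

# In-memory database standing in for the libSQL file.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE entities (
    name        TEXT PRIMARY KEY,
    entity_type TEXT NOT NULL,
    created_at  TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE observations (
    id          INTEGER PRIMARY KEY,
    entity_name TEXT NOT NULL REFERENCES entities(name),
    content     TEXT NOT NULL
);
CREATE TABLE relations (
    source        TEXT NOT NULL REFERENCES entities(name),
    target        TEXT NOT NULL REFERENCES entities(name),
    relation_type TEXT NOT NULL,
    PRIMARY KEY (source, target, relation_type)
);
""")

# Store an entity with one observation, then read it back.
conn.execute("INSERT INTO entities (name, entity_type) VALUES ('mcp', 'protocol')")
conn.execute(
    "INSERT INTO observations (entity_name, content) VALUES ('mcp', 'used by AI assistants')"
)
row = conn.execute(
    "SELECT e.name, o.content FROM entities e "
    "JOIN observations o ON o.entity_name = e.name"
).fetchone()
print(row)  # ('mcp', 'used by AI assistants')
```

Relevance ranking (name > type > observation) would then be a matter of weighting which column a search term matched in.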

Development

Publishing

Due to npm 2FA requirements, publishing needs to be done manually:

  1. Create a changeset (documents your changes):
     pnpm changeset
  2. Version the package (updates version and CHANGELOG):
     pnpm changeset version
  3. Publish to npm (will prompt for 2FA code):
     pnpm release

Contributing

Contributions are welcome! Please read our contributing guidelines before submitting pull requests.

License

MIT License - see the LICENSE file for details.

Acknowledgments
