An official MCP server for Qdrant that provides GDPR-compliant vector search and integrates AI memory with chat platforms via the Model Context Protocol.
A purpose-built MCP server that enables semantic data retrieval via vector embeddings, allowing AI systems to perform meaning-based searches across large datasets; Qdrant is a leading example, offering standard interfaces for vector operations.
An MCP server that exposes Pinecone's vector search capabilities through the Model Context Protocol.
An MCP server integrating Pinecone for advanced vector management, offering features like automatic namespace partitioning, metadata-aware chunking, and cost-optimized upserts for high-performance data recall.
An MCP server for Milvus/Zilliz vector databases, enabling direct interaction with your database through the Model Context Protocol.
An MCP server that adds Model Context Protocol support to Qdrant vector databases.
The Qdrant MCP Server integrates with the Qdrant vector search engine, allowing AI agents to store and retrieve semantic information using MCP, ideal for advanced AI memory and retrieval tasks.
Category: database-messaging-mcp-servers
Tags: qdrant, vector-database, gdpr, ai-integration, mcp
Source: https://mcpserve.com/servers/qdrant
qdrant-mcp-vector-engine is an official MCP (Model Context Protocol) server for Qdrant. It provides a semantic memory layer on top of the Qdrant vector search engine, enabling AI agents and chat platforms to store and retrieve information in a GDPR-compliant manner. It standardizes the interaction between AI models and Qdrant for knowledge retrieval and context enrichment.
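Conceptually, such a memory layer comes down to two operations against a Qdrant collection: upserting embedded text together with a payload, and running a nearest-neighbour query over the same collection. The sketch below shows those two operations using the qdrant-client Python library directly; the collection name, vector size, and placeholder embedding function are illustrative assumptions, not the server's actual implementation.

from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams

# In-memory instance for illustration; a real deployment would point at a
# running Qdrant server, e.g. QdrantClient(url="http://localhost:6333").
client = QdrantClient(":memory:")

client.create_collection(
    collection_name="ai-memory",  # assumed collection name
    vectors_config=VectorParams(size=4, distance=Distance.COSINE),
)

def embed(text: str) -> list[float]:
    # Placeholder: stands in for whatever embedding model produces the
    # vectors. The official server handles embedding internally.
    return [float(len(text)), 0.1, 0.2, 0.3]

# Store a piece of information with its payload (what a "store" tool does).
client.upsert(
    collection_name="ai-memory",
    points=[
        PointStruct(
            id=1,
            vector=embed("Qdrant stores vectors with payloads"),
            payload={"text": "Qdrant stores vectors with payloads"},
        )
    ],
)

# Retrieve semantically similar entries (what a "find" tool does).
hits = client.query_points(
    collection_name="ai-memory",
    query=embed("how does Qdrant store data?"),
    limit=3,
).points
for hit in hits:
    print(hit.payload, hit.score)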
The server exposes two tools: qdrant-store for storing data (with support for metadata) and qdrant-find for semantic search within the Qdrant database. It runs via uvx and connects to any Qdrant instance. No pricing information is provided.
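A minimal client-side sketch of that flow, assuming the MCP Python SDK (the mcp package) and the tool and argument names documented for the official server (qdrant-store with information/metadata, qdrant-find with query); the Qdrant URL, collection name, environment variable names, and stored text are placeholders to verify against the current release.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server via uvx; environment variables configure the target
# Qdrant instance (names assumed from the official README).
server_params = StdioServerParameters(
    command="uvx",
    args=["mcp-server-qdrant"],
    env={
        "QDRANT_URL": "http://localhost:6333",
        "COLLECTION_NAME": "ai-memory",  # assumed collection name
    },
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Store a piece of information with optional metadata.
            await session.call_tool(
                "qdrant-store",
                arguments={
                    "information": "The deployment runs Qdrant behind nginx.",
                    "metadata": {"topic": "infrastructure"},
                },
            )

            # Retrieve semantically related entries.
            result = await session.call_tool(
                "qdrant-find",
                arguments={"query": "How is Qdrant deployed?"},
            )
            print(result.content)

asyncio.run(main())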