A CLI host application that allows Large Language Models (LLMs) to interact with external tools via the Model Context Protocol, serving as a host for MCP servers.
A specialized MCP server providing LLM enhancement prompts and jailbreaks with dynamic schema adaptation, designed to augment AI capabilities over the Model Context Protocol.
A Model Context Protocol (MCP) server implementation built with Bun and the Elysia web framework, enabling high-performance MCP servers that expose resources, tools, and prompts to LLMs through a standardized interface.
Enables interactive LLM workflows by adding local user prompts and chat functionality directly into the MCP loop, enhancing the interactivity of MCP servers.
A Python-based MCP server for querying OpenAI models directly from Claude or other clients using the Model Context Protocol.
A Model Context Protocol (MCP) server that queries multiple Ollama models and combines their responses, offering diverse AI perspectives via a single MCP server endpoint.
An MCP server for integrating Raindrop.io bookmarks with LLMs using the Model Context Protocol.
A CLI host application that enables Large Language Models (LLMs) to interact with external tools via the Model Context Protocol (MCP).
MCPHost reads its MCP server configuration from a JSON file (~/.mcp.json by default), with support for custom paths, and respects environment variables (e.g., OLLAMA_HOST for the Ollama base URL). No pricing information is provided. MCPHost is open source and available under the MIT License.
Tags: cli, llm-integration, host, open-source
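As a rough illustration of the configuration mentioned above, the sketch below shows one plausible shape for a ~/.mcp.json file, assuming the mcpServers layout commonly used by MCP host applications; the server name, command, and arguments are illustrative placeholders, not confirmed MCPHost defaults.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

Each key under mcpServers names one MCP server the host should launch, and the command/args pair tells the host how to start it; the host then exposes that server's tools to the connected LLM.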