MCP server to access and manage LLM application prompts created with Langfuse Prompt Management.
Related MCP servers:

- A Model Context Protocol implementation for managing and serving AI prompts with a TypeScript architecture, showcasing MCP server capabilities for AI prompt management.
- A flexible, template-based prompt system server for Claude models, enabling standardized interactions and complex reasoning workflows through a TypeScript/Node.js MCP server with API support.
- An MCP server that enables multiple LLMs to share and analyze responses for collaborative problem-solving and multi-perspective analysis using the Model Context Protocol.
- A Model Context Protocol (MCP) server implementation for the Google Gemini language model, allowing Claude Desktop users to access the reasoning capabilities of Gemini-2.0-flash-thinking-exp-01-21.
- An MCP server for interacting with the Perplexity API, providing AI assistants with access to Perplexity's capabilities via MCP.
- A step-by-step reasoning framework for LLMs that supports structured problem-solving and complex task execution, with an implementation by hemangjoshi37a.
Description: MCP server to access and manage LLM application prompts created with Langfuse Prompt Management. It implements the Model Context Protocol (MCP) to allow prompt discovery, retrieval, and compilation for use in LLM applications.
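Prompt compilation here means filling a stored template's variables before handing the text to an LLM. A minimal sketch, assuming Langfuse-style `{{variable}}` placeholders; the template text and variable names below are illustrative, not taken from the repository:

```typescript
// Sketch of prompt "compilation": substitute {{variable}} placeholders
// in a stored prompt template with caller-supplied values.
// Unknown placeholders are left untouched rather than erased.
function compilePrompt(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, name) =>
    name in vars ? vars[name] : match
  );
}

// Illustrative template, not an actual Langfuse-managed prompt.
const template = "Summarize the following {{docType}} in {{language}}.";
const compiled = compilePrompt(template, { docType: "article", language: "English" });
console.log(compiled); // "Summarize the following article in English."
```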
Source: https://github.com/langfuse/mcp-server-langfuse
Category: ai-integration-mcp-servers
Tags: mcp, langfuse, prompt-management, llm, ai-integration
The server exposes:

- prompts/list endpoint to list all available prompts, with optional cursor-based pagination.
- prompts/get endpoint to retrieve a specific prompt.
- get-prompts and get-prompt tools to replicate MCP Prompts functionality for clients without prompt capability.

No pricing information is provided; the project is open source under the MIT license.
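The cursor-based pagination on prompts/list can be pictured as a client looping until the server stops returning a cursor. A minimal sketch, assuming the MCP-style list result shape `{ prompts, nextCursor? }`; the in-memory handler below is a stand-in for illustration, not the actual Langfuse MCP server:

```typescript
// Assumed MCP-style list result: a page of prompts plus an optional cursor.
interface PromptInfo { name: string; }
interface ListResult { prompts: PromptInfo[]; nextCursor?: string; }

// Mock data and a mock prompts/list handler (page size 2, cursor = next offset).
const allPrompts: PromptInfo[] = Array.from({ length: 5 }, (_, i) => ({ name: `prompt-${i}` }));

function listPrompts(cursor?: string): ListResult {
  const start = cursor ? parseInt(cursor, 10) : 0;
  const page = allPrompts.slice(start, start + 2);
  const next = start + 2 < allPrompts.length ? String(start + 2) : undefined;
  return { prompts: page, nextCursor: next };
}

// Client-side loop: keep requesting pages while the server returns a cursor.
function listAllPrompts(): PromptInfo[] {
  const collected: PromptInfo[] = [];
  let cursor: string | undefined;
  do {
    const result = listPrompts(cursor);
    collected.push(...result.prompts);
    cursor = result.nextCursor;
  } while (cursor !== undefined);
  return collected;
}

console.log(listAllPrompts().map(p => p.name)); // logs all five prompt names
```

A real client would issue the same loop over JSON-RPC requests to the server instead of calling a local function.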
MIT License