An MCP server to share code context with LLMs, supporting clipboard and protocol-based interactions.
A server that enables Claude Desktop users to access the Claude API directly, bypass Professional Plan limitations, and use advanced features like custom system prompts and conversation management. A practical MCP server for enhanced Claude integration.
An MCP server enhancing Claude's context management, knowledge organization, and conversation thread storage across sessions. A robust MCP option for persistent state and project context organization.
An MCP server that enhances AI agent reasoning by integrating the 'think' tool described by Anthropic, improving AI context management via MCP.
A desktop application that acts as an MCP host, allowing users to connect to and interact with various MCP servers. Essential for testing and using MCP servers, which makes it highly relevant to the Awesome MCP Servers ecosystem.
A Model Context Protocol (MCP) server designed for Unity3d Game Engine integration, allowing seamless AI-assisted development and tooling within Unity projects.
Flipt’s MCP server allows AI assistants and LLMs to interact with feature flags, segments, and evaluations via a standardized MCP interface. An example of a Model Context Protocol server for feature flag management.
Description: llm-context.py is an open-source MCP server and CLI tool that enables developers to share code context with Large Language Models (LLMs). It supports both Model Context Protocol (MCP) integration and clipboard-based workflows, making it easy to switch between tasks such as code review and documentation. The tool provides smart code outlining and rule-based customization for flexible context sharing; a configuration sketch follows the metadata below.
Source: GitHub - cyberchitta/llm-context.py
Category: Development Tools / MCP Servers
Tags: llm, context-management, ai-integration, open-source
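
As the description notes, the tool can run as an MCP server so that MCP-aware clients can pull project context directly. The sketch below shows one way this might be wired into Claude Desktop's claude_desktop_config.json; the lc-mcp entry point, the uvx invocation, and the macOS config path are assumptions, so consult the project README for the exact setup.

```sh
# Hypothetical sketch: register llm-context.py's MCP server with Claude Desktop.
# The "lc-mcp" entry point and the uvx invocation are assumptions; check the
# project README for the exact command and arguments.
# NOTE: this overwrites the file -- merge by hand if other servers are already
# configured. The path below is the macOS location; adjust for your platform.
CONFIG="$HOME/Library/Application Support/Claude/claude_desktop_config.json"
cat > "$CONFIG" <<'EOF'
{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
EOF
```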
Commands:
lc-init: Initialize project configuration
lc-set-rule <n>: Switch rules
lc-sel-files: Select files for inclusion
lc-sel-outlines: Select files for outline generation
lc-context: Generate and copy context (with options for prompts and user notes)
lc-prompt: Generate project instructions for LLMs
lc-clip-files: Process LLM file requests
lc-changed: List files modified since last context generation
lc-outlines: Generate outlines for code files
lc-clip-implementations: Extract code implementations requested by LLMs (not for C/C++)

llm-context.py is open source and free to use under the Apache-2.0 license.
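
For the clipboard-based workflow, a typical session might chain the commands listed above roughly as follows. This is a minimal sketch: the rule name "code" and the project path are placeholders, and the available rule names depend on your project configuration.

```sh
# A minimal sketch of a clipboard-based session using the lc-* commands above.
# "code" is an assumed rule name and /path/to/project is a placeholder.
cd /path/to/project
lc-init          # one-time: initialize the project configuration
lc-set-rule code # switch to a rule suited to the current task (name assumed)
lc-sel-files     # select the files to include in the context
lc-context       # generate the context and copy it to the clipboard
# Paste the context into the LLM chat. If the LLM asks for more files,
# copy its request to the clipboard and run:
lc-clip-files
```

The same flow can presumably produce outline-level context by selecting with lc-sel-outlines instead of lc-sel-files before running lc-context.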