The most reliable AI agent framework that supports MCP.
LLM agent framework in ComfyUI that includes an MCP server, Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes, provides access to Feishu and Discord, and adapts to all LLMs with OpenAI- or AISuite-style interfaces, such as o1, Ollama, Gemini, Grok, Qwen, GLM, DeepSeek, Kimi, and Doubao. Also supports local LLMs, VLMs, and GGUF models such as Llama-3.3 and Janus-Pro, plus Linkage GraphRAG.
On-premises conversational RAG with configurable containers
Python and TypeScript library for integrating the Stripe API into agentic workflows
A middleware that provides an OpenAI-compatible endpoint able to call MCP tools
A simple CLI to run LLM prompts and act as an MCP client.
A zero-configuration tool for automatically exposing FastAPI endpoints as Model Context Protocol (MCP) tools.
A Model Context Protocol (MCP) server that enables secure interaction with MySQL databases
Connect to MCP servers that run on SSE transport, or expose stdio servers as an SSE server using the MCP Proxy server.
A Model Context Protocol (MCP) server implementation for DuckDB, providing database interaction capabilities
A Model Context Protocol (MCP) server that helps read GitHub repository structure and important files.
Run any AWS Lambda function as a Large Language Model (LLM) tool without code changes using Anthropic's Model Context Protocol (MCP).
A Model Context Protocol (MCP) server for interacting with Twitter. (A minimal MCP server and client sketch follows this list.)
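
Several of the entries above are MCP servers or clients. The sketch below shows the smallest useful server: a single tool exposed over stdio. It assumes the official MCP Python SDK (the `mcp` package) and its FastMCP helper; the server name "demo-tools" and the `add` tool are illustrative, not taken from any repository listed above.

```python
# server.py - minimal MCP server sketch exposing one tool over stdio.
# Assumes the official MCP Python SDK ("mcp" package) is installed.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")  # illustrative server name

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    # stdio transport lets an MCP client spawn and talk to this process directly;
    # an SSE proxy (such as the one listed above) can expose it over HTTP instead.
    mcp.run(transport="stdio")
```

A client such as the CLI or proxy entries above would then spawn the server and call its tools. A minimal client sketch, again assuming the official MCP Python SDK and the hypothetical `server.py` file above:

```python
# client.py - minimal MCP client sketch: spawn server.py over stdio,
# list its tools, and call the "add" tool. Assumes the official MCP Python SDK.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()  # MCP handshake
            tools = await session.list_tools()
            print("tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool("add", {"a": 2, "b": 3})
            print("result:", result.content)

if __name__ == "__main__":
    asyncio.run(main())
```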