Show HN: MCP-123, a 2-line MCP server/client (Windows-friendly)
- #FastMCP
- #OpenAI
- #MCP
- Ultra-minimal setup: Start a server or client in 2 lines.
- Easy tool creation: Write normal functions in tools.py (no decorators or special wrappers needed) and they are automatically included as tools.
- OpenAI integration: The client uses your OpenAI API key to answer questions, calling tools as needed.
- Server: Loads all top-level functions from tools.py and exposes them as MCP tools via HTTP.
- Client: Discovers available tools, sends prompts to OpenAI, and automatically calls tools if needed.
- Python 3.11+ required.
- OpenAI API key needed for the client.
- Zero boilerplate: No schemas or wrappers to write; just write functions.
- LLM-native: Designed for seamless LLM tool use.
- Extensible: Add more tools by simply adding functions.
- Built with FastMCP.
- Inspired by the Model Context Protocol (MCP).
- Pull requests and issues are welcome.
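Since tools are plain functions, a tools.py might look like the following. This is a hypothetical example (the function names `add` and `word_count` are illustrations, not part of MCP-123); type hints and docstrings are ordinary Python, with nothing MCP-specific required:

```python
# tools.py -- every top-level function here is picked up as an MCP tool.
# Example tools only; not shipped with MCP-123.

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

def word_count(text: str) -> int:
    """Count whitespace-separated words in a string."""
    return len(text.split())
```

Adding a new tool is then just adding another function to this file.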
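The "loads all top-level functions from tools.py" step can be sketched with the standard `inspect` module. This illustrates the general mechanism under the stated behavior; it is not MCP-123's actual source:

```python
import inspect
import types

def discover_tools(module: types.ModuleType) -> dict:
    """Return {name: function} for top-level functions defined in `module`.

    Functions imported from other modules and _private helpers are skipped.
    """
    return {
        name: fn
        for name, fn in inspect.getmembers(module, inspect.isfunction)
        if fn.__module__ == module.__name__ and not name.startswith("_")
    }

# Demo: build a stand-in for tools.py in memory instead of on disk.
demo = types.ModuleType("tools")
exec("def greet(name: str) -> str:\n    return f'Hello, {name}!'", demo.__dict__)

tools = discover_tools(demo)
```

Filtering on `fn.__module__` is what keeps imported helpers (e.g. `os.path.join`) from being exposed as tools.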
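The "no need to write schemas" claim implies the server derives a parameter schema from each function signature. A minimal sketch of how that derivation could work (an assumption about the approach, covering only a few primitive types; `repeat` is a hypothetical tool):

```python
import inspect

# Illustrative subset of Python annotation -> JSON Schema type mappings.
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def schema_for(fn) -> dict:
    """Build a minimal JSON-Schema-style parameter description from a signature."""
    props = {
        name: {"type": _JSON_TYPES.get(param.annotation, "string")}
        for name, param in inspect.signature(fn).parameters.items()
    }
    return {"type": "object", "properties": props, "required": list(props)}

def repeat(text: str, times: int) -> str:  # hypothetical tool
    return text * times

schema = schema_for(repeat)
```

Here `schema["properties"]` maps `text` to a string type and `times` to an integer type, so the LLM client knows what arguments each tool expects without any hand-written schema.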