MCP API Bridge

Status: Released · MIT licensed · View on GitHub · Documentation site

A production-quality starter kit that wraps any REST API as an MCP server, so Claude, Cursor, and other AI assistants can call it directly through typed, validated tools. Ships with 4 working tools against JSONPlaceholder as a reference implementation. The real value is the pattern.

Problem

The Model Context Protocol lets AI assistants call external tools, but building an MCP server from scratch means solving the same problems every time: input validation, error handling, response formatting, test infrastructure, and Claude Desktop configuration. Developers who want to expose their own APIs to AI assistants need a working reference, not just documentation.

Approach

MCP API Bridge provides a complete, tested implementation that developers can fork and adapt. The architecture separates the HTTP client layer from the MCP tool layer. When adapting for a new API, you primarily modify the client and the Pydantic models while the MCP wiring stays the same.

The starter kit demonstrates the patterns that production MCP servers need:

  • Pydantic v2 input validation with field constraints, so the AI assistant sees the schema and knows exactly what to send
  • Dual response formats: markdown for human-readable output, JSON for machine consumption
  • Correct MCP annotations (readOnlyHint, destructiveHint, idempotentHint, openWorldHint) on every tool
  • Proper error propagation

What’s Included

  • 4 working MCP tools: api_list_posts (GET with filtering + pagination), api_get_post (resource lookup with optional related data), api_create_post (write with validation), api_update_post (partial updates with existence checks)
  • Pydantic v2 input models: Every tool parameter is typed with constraints, generating JSON Schema that AI assistants use for structured tool calls
  • 74 tests: Unit tests with pytest-httpx mocking, tool-level tests, and end-to-end tests over the live MCP protocol against JSONPlaceholder
  • Claude Desktop config: Drop-in claude_desktop_config.json for immediate local use
  • 3-step adaptation guide: Replace the API client, define your Pydantic models, register your tools
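For context, a Claude Desktop config for a stdio MCP server generally looks like the sketch below. The server name and module path here are illustrative placeholders; the kit's bundled claude_desktop_config.json has its own values.

```json
{
  "mcpServers": {
    "mcp-api-bridge": {
      "command": "python",
      "args": ["-m", "mcp_api_bridge.server"]
    }
  }
}
```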

Architecture

Built with FastMCP (the high-level framework in the MCP Python SDK), the server uses stdio transport, the right default for local-first MCP servers, with a one-line switch to streamable_http for remote deployment. The project uses standard Python packaging (PEP 621 via pyproject.toml) and requires Python 3.10+.

The separation between api_client.py (HTTP layer) and server.py (MCP tools) is deliberate. This clean boundary means you can swap the entire API backend without touching any MCP code, and vice versa.
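The boundary can be sketched with a stdlib typing.Protocol (all names here are hypothetical, not the kit's actual classes): the tool layer depends only on the interface, so any backend that satisfies it structurally can be swapped in without touching MCP code.

```python
from typing import Protocol


class PostsClient(Protocol):
    """The interface the tool layer depends on; api_client.py provides it."""

    def get_post(self, post_id: int) -> dict: ...


class FakePostsClient:
    """A stand-in backend (e.g. for tests); satisfies PostsClient structurally."""

    def get_post(self, post_id: int) -> dict:
        return {"id": post_id, "title": "stub"}


def get_post_tool(client: PostsClient, post_id: int) -> str:
    # Tool layer: formats the client's response as markdown,
    # knows nothing about HTTP, URLs, or retries.
    post = client.get_post(post_id)
    return f"# {post['title']} (id {post['id']})"
```

This is the same shape that makes the kit's unit tests cheap: the tool layer is exercised against a fake client, while the HTTP layer is tested separately with mocked responses.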

What This Demonstrates

Production MCP server architecture with proper validation, annotation, and testing patterns. Designed as both a reference implementation for the MCP community and a practical starting point for teams that need to expose internal APIs to AI assistants.

Need an MCP server built for your API? Get in touch.