Minh-Phuc Tran, a prominent voice in the technology sphere, has recently highlighted a perceived limitation within the Model Context Protocol (MCP), stating, "The biggest problem of MCP is composability." In a tweet, Tran elaborated, "HTTP & React are great because they're highly composable, making it easy to build on top of each other. But MCP servers only talk to MCP clients, they don't talk to other servers. MCP clients can't talk to other MCP clients, either." This observation points to a design choice within MCP that prioritizes a client-server relationship over direct peer-to-peer communication between servers or clients.
The Model Context Protocol, open-sourced by Anthropic, is designed to standardize how AI applications interact with external data sources and tools. It operates on a client-server architecture in which an AI application, acting as an MCP "host," creates and manages multiple MCP clients. Each client maintains a dedicated one-to-one connection to a specific MCP server, which exposes capabilities like tools, resources, and prompts. This structure allows a single AI application to orchestrate complex workflows by leveraging functionalities from various external services.
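This topology can be sketched in plain Python. The sketch below is purely illustrative and does not use the official MCP SDK; the class names, the `tools` dictionary, and the lambda-based tool implementations are all assumptions made for clarity, not part of the protocol itself.

```python
from dataclasses import dataclass, field


@dataclass
class MCPServer:
    """A server exposes named capabilities (here, just tools)."""
    name: str
    tools: dict = field(default_factory=dict)  # tool name -> callable

    def call_tool(self, tool: str, *args):
        return self.tools[tool](*args)


@dataclass
class MCPClient:
    """Each client holds a dedicated 1:1 connection to exactly one server."""
    server: MCPServer

    def request(self, tool: str, *args):
        return self.server.call_tool(tool, *args)


class Host:
    """The AI application: it creates one client per server it wants to use.
    Servers never reference other servers; clients never reference clients."""
    def __init__(self):
        self.clients = {}

    def connect(self, server: MCPServer):
        self.clients[server.name] = MCPClient(server)

    def call(self, server_name: str, tool: str, *args):
        return self.clients[server_name].request(tool, *args)


# One host, two servers, one dedicated client per server.
fs = MCPServer("filesystem", {"read": lambda path: f"contents of {path}"})
git = MCPServer("git", {"status": lambda: "clean"})

host = Host()
host.connect(fs)
host.connect(git)

print(host.call("filesystem", "read", "README.md"))  # contents of README.md
print(host.call("git", "status"))                    # clean
```

Note that in this model all cross-service traffic is forced through the `Host`: there is no path from `fs` to `git`, which is exactly the property Tran's tweet describes.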
While MCP's design means servers do not communicate directly with other servers, nor clients with other clients, the host application manages the overall interaction and can chain operations across different servers. For instance, an AI agent in an Integrated Development Environment (IDE) could use one MCP server to access the file system and another to interact with a version control system, with the IDE coordinating these actions. This centralized orchestration by the host is the core of MCP's approach to composability.
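A minimal sketch of that host-side chaining, again using plain Python rather than the official SDK: the two server functions, their tool names (`read_file`, `commit`), and the dictionary payloads are hypothetical stand-ins chosen for illustration. The point is that the host, not either server, pipes the output of one capability into the input of the next.

```python
# Two independent "servers". Each is reachable only through the host's
# dedicated client connection; neither ever calls the other.
def filesystem_server(tool, **kwargs):
    """Hypothetical file-system MCP server."""
    if tool == "read_file":
        return {"content": f"print('hello')  # from {kwargs['path']}"}
    raise ValueError(f"unknown tool: {tool}")


def git_server(tool, **kwargs):
    """Hypothetical version-control MCP server."""
    if tool == "commit":
        return {"ok": True, "message": kwargs["message"],
                "bytes": len(kwargs["content"])}
    raise ValueError(f"unknown tool: {tool}")


def host_workflow(path, message):
    """Host-side orchestration: the result from server 1 is handed to
    server 2 by the host itself, never server-to-server."""
    file_data = filesystem_server("read_file", path=path)   # server 1
    return git_server("commit",                             # server 2
                      content=file_data["content"],
                      message=message)


result = host_workflow("app.py", "update app.py")
print(result["ok"], result["message"])  # True update app.py
```

The trade-off is visible here: composition is possible, but only the host can express it, so every new cross-server workflow must be implemented at the application layer rather than between servers.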
Despite the architectural point raised by Tran, MCP has seen rapid adoption since its open-sourcing in late 2024, with its momentum accelerating significantly in early 2025. Major platforms and tools, including Microsoft Copilot Studio, Amazon Bedrock Agents, and various IDEs like Cursor, have integrated or announced support for MCP. This widespread embrace underscores the industry's need for a standardized protocol to connect large language models (LLMs) with the vast ecosystem of enterprise data and applications, moving beyond custom, one-off integrations.
The protocol's benefits include unified integration, reduced development time due to standardized patterns, and clear separation of concerns between data access and computation. Recent updates to the MCP specification have also addressed initial concerns around security and authentication, with the inclusion of a comprehensive OAuth 2.1 framework. As the ecosystem matures, developers are building a growing number of MCP servers for popular services like GitHub, Slack, and Google Drive, further expanding the protocol's reach and practical applications.