mcp-agent is a simple, composable framework for building AI agents using the Model Context Protocol (MCP). It implements Anthropic’s Building Effective Agents patterns in a model-agnostic way, allowing you to create robust agent applications that can leverage any MCP-compatible tools and services.

What is mcp-agent?

mcp-agent bridges the gap between the low-level Model Context Protocol and high-level agent patterns. It handles the complex mechanics of managing MCP server connections, tool orchestration, and workflow patterns so you can focus on building your application logic.

MCP Integration

Seamless connection to any MCP server with tools, resources, prompts, and file system roots

Agent Patterns

Pre-built implementations of proven agent workflow patterns

Model Agnostic

Works with OpenAI, Anthropic, Azure, AWS Bedrock, Google, and Ollama

Composable

Chain workflow patterns together for complex behaviors

Human-in-the-Loop

Built-in support for human input and workflow signaling

Observable

Comprehensive logging and telemetry support

Why mcp-agent?

The MCP Ecosystem

As more services become MCP-aware, you can use mcp-agent to build agents that leverage these tools out of the box. mcp-agent is the only framework purpose-built for MCP, making it the simplest way to build robust agent applications.

Proven Patterns

mcp-agent implements every pattern from Anthropic’s research on building effective agents: the Augmented LLM building block, Prompt Chaining, Routing, Parallelization, Orchestrator-Workers, and Evaluator-Optimizer.
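To give a flavor of one of these patterns, here is a framework-free, illustrative sketch of the evaluator-optimizer loop: one worker revises a draft, an evaluator judges it, and the loop repeats until the draft passes or a retry budget runs out. The two functions are toy stand-ins for real LLM calls, not mcp-agent APIs:

```python
def optimize(draft: str) -> str:
    """Toy stand-in for an LLM call that revises the draft."""
    return draft + "!"

def evaluate(draft: str) -> bool:
    """Toy stand-in for an LLM call that judges the draft."""
    return draft.endswith("!!!")

def evaluator_optimizer(initial: str, max_rounds: int = 5) -> str:
    """Refine the draft until the evaluator accepts it or rounds run out."""
    draft = initial
    for _ in range(max_rounds):
        if evaluate(draft):
            break
        draft = optimize(draft)
    return draft

print(evaluator_optimizer("hello"))  # "hello!!!"
```

In a real agent, both functions would be LLM calls with different prompts; the loop structure is the pattern.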

Quick Example

Here’s a simple agent that can read files and fetch web content:

import asyncio
from mcp_agent.app import MCPApp
from mcp_agent.agents.agent import Agent
from mcp_agent.workflows.llm.augmented_llm_openai import OpenAIAugmentedLLM

app = MCPApp(name="finder_agent")

async def main():
    async with app.run() as mcp_agent_app:
        # Create an agent with access to fetch and filesystem servers
        finder_agent = Agent(
            name="finder",
            instruction="You can read local files or fetch URLs. Return the requested information when asked.",
            server_names=["fetch", "filesystem"]
        )

        async with finder_agent:
            # Attach an LLM to the agent
            llm = await finder_agent.attach_llm(OpenAIAugmentedLLM)

            # Ask the agent to read a file
            result = await llm.generate_str("Show me what's in README.md")
            print(result)

            # Multi-turn conversation
            result = await llm.generate_str("Summarize that in a tweet")
            print(result)

if __name__ == "__main__":
    asyncio.run(main())
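The example assumes the `fetch` and `filesystem` servers are defined in the app's configuration, and that an OpenAI API key is supplied separately (for example in `mcp_agent.secrets.yaml`). A minimal `mcp_agent.config.yaml` sketch, with illustrative launch commands that depend on how you install those servers:

```yaml
mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
```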

What You’ll Learn

1. Get Started: install mcp-agent and create your first agent
2. Core Concepts: understand agents, MCP servers, and Augmented LLMs
3. MCP Servers: work with tools, resources, prompts, and roots
4. Workflow Patterns: implement sophisticated agent behaviors
5. Integrations: build applications with Streamlit, Claude Desktop, and more
6. Advanced Features: composition, monitoring, and Temporal

Ready to Get Started?

Let’s install mcp-agent and build your first agent!