Three AWS MCP Servers You Should Use Today

Cloud | AWS | DevOps | AI 📍 Toronto 🇨🇦 🚀 Cloud Architect @ AWS 👨🏽🏫 Professor
In the rapidly evolving landscape of Generative AI, giving your LLM (Large Language Model) access to real-time data and specialized tools is the difference between a generic chatbot and a powerful AI assistant.
AWS has embraced the Model Context Protocol (MCP), providing a suite of servers that allow AI assistants to interact directly with AWS infrastructure, documentation, and architecture tools.
What is MCP?
The Model Context Protocol (MCP) is an open-source standard, originally developed by Anthropic, that acts as a "universal translator" between AI models and external data sources.
Historically, connecting an AI to a specific tool required custom code, unique API integrations, and complex data parsing. MCP solves this by providing a standardized client-server architecture:
MCP Client: Your AI interface (e.g., Kiro, Claude Desktop, or Amazon Q Developer).
MCP Server: A lightweight program that exposes specific tools or data (e.g., AWS Docs, GitHub, or a Database).
By using MCP, you can "plug in" expert capabilities to your AI assistant without writing any integration glue-code.
Connecting to AWS MCP Servers
Most AWS MCP servers are hosted on GitHub (awslabs/mcp) and can be run locally or via remote managed endpoints. To connect them, you generally follow these steps:
Install a Manager: Most servers require Python 3.10+ and the uv package manager.
Configure Credentials: Ensure your local environment has active AWS credentials (via aws configure or environment variables) with the necessary IAM permissions. This is only needed for servers that act on your AWS account.
Update Client Config: Add the server details to your MCP client's configuration file (usually mcp.json).
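The steps above can be sketched as a couple of terminal commands (a minimal sketch for macOS/Linux, using the uv install script published by Astral; Windows users should follow the uv docs instead):

```shell
# Step 1: install uv, which provides the `uvx` runner used in the
# configuration below (install script published by Astral)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Step 2 (only needed for servers that act on your AWS account):
# confirm that your credentials actually resolve, e.g.:
#   aws sts get-caller-identity
```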
Example Configuration Snippet
"mcpServers": {
"aws-documentation": {
"command": "uvx",
"args": ["awslabs.aws-documentation-mcp-server"]
}
}
AWS MCP Servers You Should Use Today
While AWS offers a growing list of servers (including servers for Lambda, S3, and CloudWatch), these three provide the most immediate value for developers and architects.
1. AWS Documentation MCP
The AWS Documentation MCP exposes official AWS documentation via MCP. Instead of the LLM relying on its training data (which might be outdated), this server allows it to fetch the latest official AWS docs in real time.
This lets your AI assistant programmatically fetch and read up-to-date AWS documentation, so whether you are learning AWS or troubleshooting a project, the assistant always works from current information. That makes it far less likely to hallucinate old API syntax or deprecated service limits.
2. AWS Knowledge MCP
The AWS Knowledge MCP is a remote, fully managed server designed for deep architectural research and regional service availability checks. It can quickly tell you which services or specific features (such as a particular EC2 instance type) are available in a given AWS Region, provide full-stack development guidance, and surface the latest CDK and CloudFormation documentation for better IaC (infrastructure-as-code) development.
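Because this server is remote rather than a local uvx process, your client connects through a remote-MCP proxy. A sketch of the client config, assuming the mcp-remote proxy and the endpoint published in the awslabs/mcp README (verify both against the README before use):

```json
{
  "mcpServers": {
    "aws-knowledge-mcp-server": {
      "command": "npx",
      "args": ["mcp-remote", "https://knowledge-mcp.global.api.aws"]
    }
  }
}
```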
3. AWS Diagram MCP
The AWS Diagram MCP allows your AI to act as a cloud architect. It bridges the gap between a text-based conversation and a visual architecture diagram. It uses the Python diagrams package DSL to generate professional-looking architecture images. It can generate AWS architecture diagrams, sequence diagrams, and flowcharts. You can ask the AI to "draw a highly available web app" and it will generate the code and render the image for you to review.
The shift toward MCP marks a transition from AI assistants that merely talk about AWS to assistants that work on and with AWS. By integrating these three servers, you create an environment where your AI assistant can research, plan, and visualize your cloud infrastructure in one continuous workflow.





