Building Your AI Control Plane with FastMCP

As of 2026, FastMCP is one of the most widely used frameworks for building MCP servers quickly. It handles the protocol "handshaking" automatically, so you can focus on the Linux/Docker tools you want to expose to the AI.

Here is a starter template for an MCP server that allows an AI to list Docker containers and check resource usage.


1. The Python MCP Server (server.py)

This script uses the FastMCP library to create two tools: list_containers and container_stats.

```python
from fastmcp import FastMCP
import docker

# Initialize the MCP server and a Docker client from the local environment
mcp = FastMCP("DockerOps-Assistant 🐳")
client = docker.from_env()


@mcp.tool()
def list_containers(all: bool = False) -> str:
    """
    Lists running Docker containers.
    Set 'all' to True to include stopped containers as well.
    """
    try:
        containers = client.containers.list(all=all)
        if not containers:
            return "No containers found."
        result = "Current Containers:\n"
        for c in containers:
            result += f"- {c.name} (Status: {c.status}, Image: {c.image.tags})\n"
        return result
    except Exception as e:
        return f"Error connecting to Docker: {str(e)}"


@mcp.tool()
def container_stats(container_name: str) -> str:
    """
    Returns the CPU and memory usage for a specific container.
    """
    try:
        container = client.containers.get(container_name)
        stats = container.stats(stream=False)
        # Note: total_usage is cumulative CPU time in nanoseconds, not a percentage
        cpu = stats['cpu_stats']['cpu_usage']['total_usage']
        mem = stats['memory_stats']['usage']
        return f"Stats for {container_name}:\n- CPU Time: {cpu} ns\n- Memory Usage: {mem} bytes"
    except Exception as e:
        return f"Could not find container '{container_name}': {str(e)}"


if __name__ == "__main__":
    mcp.run()
```
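The raw total_usage figure is cumulative CPU time, which is not very readable on its own. If you want an actual percentage, Docker's stats payload also includes a precpu_stats snapshot, so you can apply the same delta formula that docker stats uses. Here is a minimal sketch; cpu_percent is a hypothetical helper, not part of FastMCP or docker-py:

```python
def cpu_percent(stats: dict) -> float:
    """Approximate CPU % from a Docker stats snapshot (hypothetical helper).

    Mirrors the formula used by `docker stats`:
    (cpu_delta / system_delta) * online_cpus * 100
    """
    cpu_delta = (stats["cpu_stats"]["cpu_usage"]["total_usage"]
                 - stats["precpu_stats"]["cpu_usage"]["total_usage"])
    system_delta = (stats["cpu_stats"]["system_cpu_usage"]
                    - stats["precpu_stats"]["system_cpu_usage"])
    if system_delta <= 0:
        return 0.0  # avoid divide-by-zero on the very first sample
    cpus = stats["cpu_stats"].get("online_cpus", 1)
    return (cpu_delta / system_delta) * cpus * 100.0

# Example with a trimmed-down stats payload:
sample = {
    "cpu_stats": {"cpu_usage": {"total_usage": 400},
                  "system_cpu_usage": 2000, "online_cpus": 2},
    "precpu_stats": {"cpu_usage": {"total_usage": 200},
                     "system_cpu_usage": 1000},
}
print(cpu_percent(sample))  # → 40.0
```

You could drop this into container_stats to report a percentage instead of raw nanoseconds.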

2. How to “Plug It In” (Claude or VS Code)

To let your AI assistant use this server, you need to add it to your client's configuration file (for Claude Desktop on macOS, this is usually ~/Library/Application Support/Claude/claude_desktop_config.json).

```json
{
  "mcpServers": {
    "docker-ops": {
      "command": "python",
      "args": ["/path/to/your/server.py"],
      "env": {
        "DOCKER_HOST": "unix:///var/run/docker.sock"
      }
    }
  }
}
```
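Editing that JSON by hand is easy to get wrong (a stray comma will break the whole file). A small sketch like the one below merges an entry into an existing config instead of overwriting it; add_mcp_server is a hypothetical helper, and the path shown in the usage comment assumes Claude Desktop on macOS:

```python
import json
from pathlib import Path

def add_mcp_server(config_path: Path, name: str, command: str,
                   args: list, env: dict) -> dict:
    """Merge one MCP server entry into a config file, preserving the rest."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})[name] = {
        "command": command,
        "args": args,
        "env": env,
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Usage (path is an assumption for Claude Desktop on macOS):
# add_mcp_server(
#     Path.home() / "Library/Application Support/Claude/claude_desktop_config.json",
#     "docker-ops", "python", ["/path/to/your/server.py"],
#     {"DOCKER_HOST": "unix:///var/run/docker.sock"},
# )
```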

3. Why this is a “Support Pro” Move

By setting this up, you aren’t just an “admin” anymore; you are building the AI Control Plane for the company.

  • The Benefit: Instead of you manually running docker ps or top and reporting back, the company’s AI can do it.
  • The “Safety” Pitch: You can explain that this server only has “Read-Only” access. It can’t delete or stop containers—it can only report on their health. This makes it a safe way to give stakeholders visibility without giving them destructive power.

4. Taking it to AKS

Once you’ve tested this locally, your next step is to deploy it to AKS.

  1. Dockerize it: Wrap the script in a lightweight Python image.
  2. Deploy as a Pod: Deploy it to the cluster.
  3. Permissions: Use the Workload Identity we set up earlier to give the pod permission to query the Kubernetes API.
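Step 1 above can be sketched as a Dockerfile. The base image tag and dependency list are assumptions, not a tested build; inside AKS you would typically talk to the Kubernetes API rather than a local Docker socket, per the Workload Identity note in step 3:

```dockerfile
# Hypothetical image for server.py (tags and packages are assumptions)
FROM python:3.12-slim
WORKDIR /app
RUN pip install --no-cache-dir fastmcp docker
COPY server.py .
CMD ["python", "server.py"]
```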
