What Is MCP?
MCP (Model Context Protocol) is a standard that allows AI models to interact with external tools in a structured and reliable way.
Instead of the model guessing how to call APIs, MCP:
Clearly defines available tools
Defines required parameters
Allows safe execution
Returns structured responses
Think of MCP as:
A universal contract between AI models and your backend systems.
Why MCP Is Important
Without MCP:
You parse natural language manually
You extract intent manually
You map to backend functions manually
You handle errors manually
With MCP:
Tools are structured
Parameters are validated
AI understands what it can use
Execution becomes standardized
High-Level Architecture
Here's how the system works:
User → AI Model → MCP Client → MCP Server → Your Backend Logic
🔹 MCP Server (What We'll Build)
Defines tools
Validates inputs
Executes logic
Returns structured results
🔹 MCP Client
Connects AI to MCP server
Sends tool calls
Returns results
🔹 AI Model
Decides which tool to use
Supplies parameters
What We'll Build
We'll build a simple Python MCP Server with three tools:
add_numbers
get_weather
get_current_time
It will:
Expose tool metadata
Accept tool execution requests
Return structured JSON responses
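Before writing any server code, it helps to see the three JSON shapes involved. These mirror the payloads the tutorial's own `/tools` and `/execute` endpoints will produce:

```python
# 1. Tool metadata exposed by GET /tools
tool_metadata = {
    "name": "add_numbers",
    "description": "Add two numbers together",
    "parameters": {
        "type": "object",
        "properties": {
            "a": {"type": "number", "description": "First number"},
            "b": {"type": "number", "description": "Second number"},
        },
        "required": ["a", "b"],
    },
}

# 2. A tool execution request a client POSTs to /execute
execute_request = {"name": "add_numbers", "arguments": {"a": 10, "b": 5}}

# 3. The structured JSON response the server returns
execute_response = {"result": 15}
```

Everything that follows is just a server that stores shapes like (1), accepts shapes like (2), and returns shapes like (3).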
Setup Environment
Install Python 3.9+
Check version:
python --version
Create Project
mkdir python-mcp-server
cd python-mcp-server
Create Virtual Environment (Recommended)
python -m venv venv
source venv/bin/activate # Mac/Linux
venv\Scripts\activate # Windows
Install Dependencies
We'll use:
FastAPI (for the API server)
Uvicorn (for running the server)
Pydantic (for validation; installed automatically as a FastAPI dependency)
pip install fastapi uvicorn
Create MCP Server File
Create:
server.py
Step 1 – Basic FastAPI Setup
Add:
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from typing import Dict, Any
import datetime
app = FastAPI(title="Python MCP Server")
Run server:
uvicorn server:app --reload
Visit:
http://127.0.0.1:8000/docs
FastAPI automatically generates interactive API documentation for you.
Step 2 – Define MCP Tool Metadata
MCP tools must include:
name
description
parameters schema
Add this below your app definition:
tools = [
    {
        "name": "add_numbers",
        "description": "Add two numbers together",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "number", "description": "First number"},
                "b": {"type": "number", "description": "Second number"}
            },
            "required": ["a", "b"]
        }
    },
    {
        "name": "get_weather",
        "description": "Get current weather by city name",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    },
    {
        "name": "get_current_time",
        "description": "Get current server time",
        "parameters": {
            "type": "object",
            "properties": {}
        }
    }
]
Step 3 – Expose the Tools Endpoint
Add:
@app.get("/tools")
def list_tools():
    return tools
Now open:
http://127.0.0.1:8000/tools
Youโll see available tools.
Step 4 – Define the Execution Request Model
We need a structured request body.
Add:
class ExecuteRequest(BaseModel):
    name: str
    arguments: Dict[str, Any]
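This model matches the JSON body a client sends. A quick sanity check with the standard library shows the shape FastAPI will parse into `ExecuteRequest` for you:

```python
import json

# Raw JSON body a client would POST to /execute
body = '{"name": "add_numbers", "arguments": {"a": 10, "b": 5}}'

payload = json.loads(body)
print(payload["name"])       # add_numbers
print(payload["arguments"])  # {'a': 10, 'b': 5}
```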
Step 5 – Implement the Tool Execution Logic
Add this route:
@app.post("/execute")
def execute_tool(request: ExecuteRequest):
    name = request.name
    args = request.arguments
    try:
        if name == "add_numbers":
            a = args.get("a")
            b = args.get("b")
            if a is None or b is None:
                raise HTTPException(status_code=400, detail="Missing parameters")
            return {"result": a + b}
        elif name == "get_weather":
            city = args.get("city")
            if not city:
                raise HTTPException(status_code=400, detail="City is required")
            # Fake weather data
            return {"result": f"The weather in {city} is 25°C and sunny."}
        elif name == "get_current_time":
            now = datetime.datetime.now()
            return {"result": now.isoformat()}
        else:
            raise HTTPException(status_code=400, detail="Tool not found")
    except HTTPException:
        # Re-raise intentional 400s unchanged instead of masking them as 500s
        raise
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
Testing the MCP Server
Using curl:
curl -X POST http://127.0.0.1:8000/execute \
-H "Content-Type: application/json" \
-d '{"name":"add_numbers","arguments":{"a":10,"b":5}}'
Response:
{
"result": 15
}
Your MCP server works!
How AI Uses This Server
When connected to an AI model:
Model reads /tools
User asks: "What time is it?"
Model selects: get_current_time({})
MCP Client sends the request to /execute
Server responds
Model formats the response for the user
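The round trip above can be simulated in plain Python. The dispatch dict below stands in for the MCP client/server hop (no HTTP involved; the names are illustrative):

```python
import datetime

# Local stand-ins for the server's tools
def add_numbers(arguments):
    return {"result": arguments["a"] + arguments["b"]}

def get_current_time(arguments):
    return {"result": datetime.datetime.now().isoformat()}

# The "server" side: tool name -> callable
DISPATCH = {"add_numbers": add_numbers, "get_current_time": get_current_time}

def execute(name, arguments):
    """Mimics POST /execute: look up the tool and run it."""
    if name not in DISPATCH:
        return {"error": "Tool not found"}
    return DISPATCH[name](arguments)

# The "model" decided to call add_numbers with these arguments:
print(execute("add_numbers", {"a": 10, "b": 5}))  # {'result': 15}
```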
Improving the Server (Best Practices)
✅ Add Proper Validation
Use Pydantic models per tool instead of manual .get() checks.
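A minimal sketch of what that looks like, one model per tool (the model and function names here are illustrative, not part of the server built above):

```python
from pydantic import BaseModel, ValidationError

# One argument model per tool replaces the manual args.get() checks
class AddNumbersArgs(BaseModel):
    a: float
    b: float

class GetWeatherArgs(BaseModel):
    city: str

def run_add_numbers(arguments: dict):
    # Raises ValidationError automatically if a field is missing or wrong-typed
    args = AddNumbersArgs(**arguments)
    return {"result": args.a + args.b}

try:
    run_add_numbers({"a": 10})  # "b" is missing
except ValidationError as e:
    print("validation failed:", e.errors()[0]["loc"])
```

In a route, you would catch `ValidationError` and convert it into an HTTP 400 with the error details.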
✅ Separate Logic From Routing
Better structure:
tools/
    weather.py
    math.py
main.py
✅ Add Logging
import logging
logging.basicConfig(level=logging.INFO)
✅ Add Authentication
For production, add:
API keys
JWT validation
OAuth
✅ Rate Limiting
Use:
pip install slowapi
Connecting to Real APIs
Replace fake weather with real API:
import requests
response = requests.get("https://api.weatherapi.com/...")
data = response.json()
Return structured data instead of plain text.
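One way to keep the result structured is to map the provider's raw JSON into your own schema before returning it. The input field names below (`location`, `current`, `temp_c`, …) follow a WeatherAPI-style payload and are assumptions; adjust them to your provider's actual schema:

```python
def format_weather(api_json: dict) -> dict:
    """Map a provider's raw JSON into a structured MCP tool result.

    The input keys used here are assumed for illustration;
    check your weather provider's documented response schema.
    """
    return {
        "result": {
            "city": api_json["location"]["name"],
            "temperature_c": api_json["current"]["temp_c"],
            "condition": api_json["current"]["condition"]["text"],
        }
    }

# Example with a canned payload (no network call needed):
sample = {
    "location": {"name": "Paris"},
    "current": {"temp_c": 21.0, "condition": {"text": "Sunny"}},
}
print(format_weather(sample))
```

Because the formatting is a pure function, you can unit-test it without hitting the real API.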
Production Deployment
Using Gunicorn
pip install gunicorn
gunicorn -w 4 -k uvicorn.workers.UvicornWorker server:app
Using Docker
Example Dockerfile:
FROM python:3.10
WORKDIR /app
COPY . .
RUN pip install fastapi uvicorn
CMD ["uvicorn", "server:app", "--host", "0.0.0.0", "--port", "8000"]
Common Beginner Mistakes
❌ Not validating parameters
❌ Returning plain text instead of structured JSON
❌ Putting all logic in one giant function
❌ No error handling
❌ No security layer
Understanding the Big Picture
MCP servers are the foundation for:
AI agents
Enterprise AI systems
Autonomous workflows
AI copilots
Internal automation
When you build an MCP server, you are essentially making your backend "AI-ready".
🎯 Final Summary
| Component | Role |
|---|---|
| MCP Server | Hosts and executes tools |
| Tool | Structured callable function |
| Parameters | JSON schema definition |
| Execute endpoint | Runs tool logic |
| AI Model | Chooses when to call tools |
🎉 Congratulations
You now know how to:
Build a Python MCP Server
Define structured tools
Execute tool calls
Return safe results
Prepare for AI integration