What is the best way to create an MCP server using Python and Docker?
- amareshpat
- Oct 22
- 7 min read
Model Context Protocol (MCP) is emerging as the next-generation bridge between AI models and external systems. Instead of relying on traditional REST endpoints or ad-hoc tool integrations, MCP provides a structured, JSON-RPC–based protocol that allows AI clients (such as LLM agents or copilots) to dynamically discover and call tools, retrieve resources, and use prompts defined by the server — all with standardized semantics and security controls.
In this post, we’ll explore how to build a fully functional MCP server in Python, package it inside Docker, and expose a few real-world tools that query live stock-market data — all while adhering to the principles of the Model Context Protocol.
What Is MCP and Why JSON-RPC?
At its core, Model Context Protocol is a lightweight, bi-directional protocol for connecting large language models to contextual data and capabilities.
Instead of HTTP endpoints, MCP servers use JSON-RPC — a message-based communication style where the client sends structured requests (methods + params) and receives structured JSON responses. This brings several advantages over REST for AI-tool integration:
Schema consistency: Each method (tool) has explicit input/output definitions.
Bi-directional communication: Supports streaming, async calls, and event notifications.
Low overhead: Simple JSON messages over HTTP, WebSocket, or even stdin/stdout.
Tool discovery: Clients can query available tools and resources automatically.
In short, JSON-RPC is a better match for the structured, tool-oriented nature of AI-agent ecosystems.
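To make the message shape concrete, here is a minimal sketch of building and parsing a JSON-RPC 2.0 request envelope in Python (the method name `check_stock_price` is just a placeholder for one of the tools we define below):

```python
import json

def make_jsonrpc_request(method: str, params: dict, req_id: int) -> str:
    """Build a JSON-RPC 2.0 request envelope: method + params + id."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params,
        "id": req_id,
    })

# A client would send this over HTTP, WebSocket, or stdin/stdout.
request = make_jsonrpc_request("check_stock_price", {"ticker": "AAPL"}, 1)
parsed = json.loads(request)
```

Every request carries the same four fields, which is what lets clients and servers validate and route messages mechanically.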
Our Example: A Stock Data MCP Server
Let’s design an MCP server that provides three stock-related tools and some supportive prompts and resources:
| Category | Name | Description |
| --- | --- | --- |
| Tools | `check_stock_price` | Return latest stock price for a ticker |
| Tools | `get_company_details` | Return company name, industry, summary |
| Tools | `get_top_news` | Return top 5 news articles for the company |
| Prompts | `summarize_company` | Summarize company profile and market context |
| Resources | `help_doc` | Markdown resource describing how to use the MCP server |
⚙️ Technology Stack
Python 3.11+
FastMCP — a Python framework for building MCP servers (similar to FastAPI but for JSON-RPC/MCP)
yfinance — fetch stock price and company metadata
NewsAPI — fetch top news articles
Docker — containerization
Uvicorn — ASGI server backend for running FastMCP
📁 Project Structure
mcp_stock_server/
├── app.py
├── tools/
│ ├── stock_price_tool.py
│ ├── company_tool.py
│ └── news_tool.py
├── prompts/
│ └── summarize_company.py
├── resources/
│ └── help_doc.md
├── requirements.txt
└── Dockerfile
requirements.txt
fastmcp
uvicorn[standard]
pydantic
yfinance
newsapi-python
🛠 Implementing MCP Tools
Each tool in MCP is a capability that clients invoke through the JSON-RPC `tools/call` method. FastMCP makes this simple: any type-annotated Python function can be registered as a tool, and its input schema is derived from the signature.
1️⃣ Stock Price Tool
tools/stock_price_tool.py
import yfinance as yf
from pydantic import BaseModel


class StockPriceOutput(BaseModel):
    ticker: str
    price: float
    currency: str


async def check_stock_price(ticker: str) -> StockPriceOutput:
    """Return the latest stock price for a ticker."""
    info = yf.Ticker(ticker).info
    return StockPriceOutput(
        ticker=ticker,
        price=info.get("regularMarketPrice", 0.0),
        currency=info.get("currency", "USD"),
    )
2️⃣ Company Details Tool
tools/company_tool.py
import yfinance as yf
from pydantic import BaseModel


class CompanyOutput(BaseModel):
    ticker: str
    name: str
    industry: str
    summary: str


async def get_company_details(ticker: str) -> CompanyOutput:
    """Return company name, industry, and business summary."""
    info = yf.Ticker(ticker).info
    return CompanyOutput(
        ticker=ticker,
        name=info.get("shortName", ""),
        industry=info.get("industry", ""),
        summary=info.get("longBusinessSummary", ""),
    )
3️⃣ Company News Tool
tools/news_tool.py
import os

from newsapi import NewsApiClient
from pydantic import BaseModel

client = NewsApiClient(api_key=os.getenv("NEWSAPI_KEY"))


class NewsArticle(BaseModel):
    title: str
    url: str
    published_at: str
    source: str


class NewsOutput(BaseModel):
    ticker: str
    articles: list[NewsArticle]


async def get_top_news(ticker: str, top_n: int = 5) -> NewsOutput:
    """Return the most recent news articles mentioning the ticker."""
    resp = client.get_everything(q=ticker, sort_by="publishedAt", page_size=top_n)
    articles = [
        NewsArticle(
            title=a["title"],
            url=a["url"],
            published_at=a["publishedAt"],
            source=a["source"]["name"],
        )
        for a in resp.get("articles", [])[:top_n]
    ]
    return NewsOutput(ticker=ticker, articles=articles)
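The mapping from NewsAPI's raw payload into our output fields can be exercised without any network access. This sketch applies the same comprehension to a hand-made response (the payload shape mirrors what `get_everything` returns; the sample data is invented):

```python
# Invented sample payload in the shape NewsAPI's get_everything returns
sample_resp = {
    "articles": [
        {
            "title": "Apple unveils new chip",
            "url": "https://example.com/a",
            "publishedAt": "2025-10-20T12:00:00Z",
            "source": {"name": "Example News"},
        },
        {
            "title": "iPhone sales climb",
            "url": "https://example.com/b",
            "publishedAt": "2025-10-19T09:30:00Z",
            "source": {"name": "Example Wire"},
        },
    ]
}

def extract_articles(resp: dict, top_n: int) -> list[dict]:
    """Flatten the NewsAPI payload into the fields our output schema needs."""
    return [
        {
            "title": a["title"],
            "url": a["url"],
            "published_at": a["publishedAt"],
            "source": a["source"]["name"],
        }
        for a in resp.get("articles", [])[:top_n]
    ]

top = extract_articles(sample_resp, top_n=1)
```

Keeping this transformation in a plain function makes the tool easy to unit-test with fixtures before wiring it to the live API.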
💬 Adding Prompts
Prompts in MCP help guide the model or user on how to use tools together. In FastMCP, a prompt is simply a function that returns a message template — for example, a summarize_company prompt that describes how to compose tool outputs.
prompts/summarize_company.py
def summarize_company(ticker: str) -> str:
    """Generate a short summary of the company including its industry, stock price, and recent news."""
    return f"""
You are a financial analyst. Given company details, current stock price, and recent top news,
write a concise paragraph summarizing the company's current market position and sentiment.

Steps:
1. Call `get_company_details` with the ticker "{ticker}".
2. Call `check_stock_price` for the latest price.
3. Call `get_top_news` to get the top 5 news articles.
4. Combine and summarize.

Output format:
A one-paragraph summary suitable for a user query like "Summarize {ticker}".
"""
📚 Adding Resources
Resources are static pieces of information or documentation that help users understand your MCP server.
resources/help_doc.md
# MCP Stock Server Help
Welcome to the Stock Data MCP Server!
Available tools:
- `check_stock_price`: Get current stock price by ticker
- `get_company_details`: Retrieve company name, industry, summary
- `get_top_news`: Get top 5 latest news articles for the company
Available prompts:
- `summarize_company`: Example prompt combining tools for market summaries
Usage Example:
client.call_tool("check_stock_price", {"ticker": "AAPL"})
🚀 Putting It All Together
import pathlib

from fastmcp import FastMCP

from tools.stock_price_tool import check_stock_price
from tools.company_tool import get_company_details
from tools.news_tool import get_top_news
from prompts.summarize_company import summarize_company

# Create the MCP server instance
mcp = FastMCP(
    name="Stock MCP Server",
    instructions="Provides stock info tools via Model Context Protocol",
)

# Register tools
mcp.tool(check_stock_price)
mcp.tool(get_company_details)
mcp.tool(get_top_news)

# Register prompt
mcp.prompt(summarize_company)

# Register resource (static markdown)
@mcp.resource("resource://help_doc")
def help_doc() -> str:
    return pathlib.Path("resources/help_doc.md").read_text()

if __name__ == "__main__":
    # Run the server over Streamable HTTP (JSON-RPC over HTTP)
    mcp.run(transport="http", host="0.0.0.0", port=8000)
This starts an MCP server speaking JSON-RPC over HTTP on port 8000 (at the `/mcp` endpoint). Clients can connect using any MCP-compatible client or SDK (for example, a copilot or agent framework that supports MCP).
🐳 Dockerizing the MCP Server
Dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Do not bake the NewsAPI key into the image; inject it at runtime with `docker run -e NEWSAPI_KEY=...`
EXPOSE 8000
CMD ["python", "app.py"]
Then build and run:
docker build -t mcp-stock-server:latest .
docker run -d -p 8000:8000 \
-e NEWSAPI_KEY=$NEWSAPI_KEY \
--name mcp-stock mcp-stock-server:latest
🔍 Testing the Server with JSON-RPC
You can exercise the server with curl or any client library. With the HTTP transport the endpoint is `/mcp`, and a real client performs the MCP `initialize` handshake first; the snippet below shows the shape of a `tools/call` request once a session is established.
Example 1 – Get Stock Price
curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "check_stock_price",
      "arguments": {"ticker": "AAPL"}
    },
    "id": 1
  }'
Response:
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      {
        "type": "text",
        "text": "{\"ticker\": \"AAPL\", \"price\": 230.4, \"currency\": \"USD\"}"
      }
    ],
    "isError": false
  }
}
Example 2 – List Tools and Prompts
MCP defines `tools/list` and `prompts/list` discovery calls. FastMCP automatically provides metadata for each registered element, so clients can self-discover capabilities.
🧱 Why Use FastMCP?
The FastMCP library brings to MCP what FastAPI brought to REST: simplicity, type safety, and developer productivity.
✅ Declarative tool registration – define JSON-RPC methods with typed schemas
✅ Automatic discovery – tools, prompts, and resources are self-describing
✅ Async support – ideal for I/O-bound operations like API calls
✅ Integrated schema validation – requests and responses are type-checked
✅ Plug-and-play with LLM frameworks – many GenAI stacks can directly call MCP servers
Under the hood, FastMCP runs on an ASGI server (like Uvicorn) and implements the MCP handshake, discovery, and JSON-RPC semantics.
🧠 Design Philosophy and Best Practices
1. Separate Logic from Protocol
Each tool’s business logic (e.g., fetching stock data) should be isolated from the MCP-specific definitions. This makes it testable and reusable.
2. Clear Input/Output Schemas
Defining explicit Pydantic/Schema classes ensures the model knows what inputs and outputs to expect — critical for AI tool calling.
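As an illustration of schema thinking — using only the standard library here, with a hypothetical normalizer rather than any FastMCP API — tightening an input type catches bad ticker strings before they ever reach yfinance:

```python
from dataclasses import dataclass

@dataclass
class StockPriceInput:
    ticker: str

    def __post_init__(self) -> None:
        # Normalize and validate eagerly so bad input fails fast,
        # before any network call is made.
        self.ticker = self.ticker.strip().upper()
        if not self.ticker:
            raise ValueError("ticker must be a non-empty string")

ok = StockPriceInput(" aapl ")  # normalized to "AAPL"
```

Pydantic models give you the same fail-fast behavior plus JSON Schema generation, which is what MCP clients consume during discovery.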
3. Provide Helpful Prompts and Resources
Prompts serve as meta-instructions for the AI client. Resources give human users and developers a quick way to explore capabilities. This combination of Tools + Prompts + Resources is what makes MCP self-descriptive.
4. Secure Secrets
Never hardcode API keys in the Docker image. Use environment variables or container-secret management (AWS Secrets Manager, GCP Secret Manager, etc.).
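A small fail-fast accessor (a sketch; the variable name matches the one used in this post) keeps the key out of code and turns a missing secret into an immediate, obvious error:

```python
import os

def get_news_api_key() -> str:
    """Read the NewsAPI key from the environment, failing loudly if absent."""
    key = os.getenv("NEWSAPI_KEY")
    if not key:
        raise RuntimeError(
            "NEWSAPI_KEY is not set; pass it with `docker run -e NEWSAPI_KEY=...`"
        )
    return key
```

Calling this once at startup surfaces configuration mistakes before the first tool call, rather than as a confusing API error mid-conversation.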
5. Enable Observability
Add request logging, latency metrics, and error tracing to observe how the MCP server performs under real load.
6. Versioning and Extensibility
Include version information in the MCP handshake so clients can adapt when tools evolve.
🔄 Scaling and Extending
Once you have this working, the architecture can easily scale:
Multiple containers behind a load balancer (for concurrency and failover)
Add caching via Redis to reduce API calls
Add more tools, e.g., get_historical_prices, get_market_sentiment, etc.
Expose streaming results (FastMCP supports async streaming responses)
Integrate with AI copilots — for instance, a local ChatGPT or LangChain agent can register this MCP server as an external tool.
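Before reaching for Redis, even an in-process TTL cache cuts repeat calls to yfinance or NewsAPI. This is a sketch; Redis can later swap in behind the same get/set interface:

```python
import time

class TTLCache:
    """Tiny in-memory cache with per-entry expiry; a stand-in for Redis."""

    def __init__(self, ttl_seconds: float) -> None:
        self.ttl = ttl_seconds
        self._store: dict = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:
            # Entry expired; drop it and report a miss
            del self._store[key]
            return None
        return value

    def set(self, key, value) -> None:
        self._store[key] = (value, time.monotonic() + self.ttl)

cache = TTLCache(ttl_seconds=60.0)
cache.set("AAPL", {"price": 230.4})
```

A per-ticker TTL of a minute or so is usually a reasonable trade-off for quote data, since prices change faster than company metadata.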
For cloud deployment, the same Docker image can run on:
AWS ECS / Fargate
Google Cloud Run
Kubernetes
Azure Container Apps
🧭 How an AI Copilot Uses This MCP Server
Here’s what a typical interaction looks like:
The AI client connects to the MCP server and discovers available tools.
When the user asks, “What’s Tesla’s stock price and recent news?”, the AI model reasons and issues two JSON-RPC calls:
check_stock_price(ticker="TSLA")
get_top_news(ticker="TSLA")
The results are combined by the model and optionally fed into the summarize_company prompt template to generate a final answer.
The MCP protocol ensures this workflow is transparent, stateless, and reproducible.
This approach yields a much cleaner and safer way for AI systems to interact with real-world data sources compared to scraping or informal REST calls.
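The workflow above can be sketched as a plain orchestration function with the tool calls stubbed out (`get_price` and `get_news` are placeholders for the MCP calls an agent would actually make):

```python
def answer_stock_question(ticker: str, get_price, get_news) -> str:
    """Combine price and news tool results into a one-line answer."""
    price = get_price(ticker)          # e.g. check_stock_price via MCP
    news = get_news(ticker)            # e.g. get_top_news via MCP
    headlines = "; ".join(a["title"] for a in news)
    return (
        f"{ticker} trades at {price['price']} {price['currency']}. "
        f"Recent headlines: {headlines}"
    )

# Stubbed tool results stand in for live MCP responses
answer = answer_stock_question(
    "TSLA",
    get_price=lambda t: {"price": 250.0, "currency": "USD"},
    get_news=lambda t: [{"title": "Tesla opens new factory"}],
)
```

In practice the LLM performs this composition itself, but the stubbed version shows why typed tool outputs matter: the combining step is trivial when each result has a known shape.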
🧩 Summary
Building an MCP server with Python + FastMCP + Docker gives you:
| Feature | Benefit |
| --- | --- |
| JSON-RPC protocol | Structured, bidirectional communication for AI clients |
| FastMCP library | Simple decorators and schema validation |
| Tools / Prompts / Resources | Full self-describing context for LLMs |
| Docker containerization | Portable, scalable deployment |
| Stock data use case | Realistic demonstration of financial APIs in MCP |
🧠 Final Thoughts
MCP is poised to become a foundational layer of agentic AI systems, where every data source or capability can be represented as a discoverable, typed “tool”.
Using FastMCP makes the developer experience intuitive, much like building REST APIs in FastAPI. By containerizing the server with Docker, you get reproducibility and scalability out of the box.
With this structure — clear tools, descriptive prompts, helpful resources — you’re not just creating an API; you’re creating a smart, discoverable context provider that any AI system can plug into.
So the next time you want your AI assistant to “fetch stock data” or “summarize company trends”, you won’t need custom code — you’ll have a ready-made MCP server doing it cleanly, securely, and beautifully.


