MCP
Basic Introduction
Official site:
- https://modelcontextprotocol.io/introduction
"MCP is an open protocol that standardizes how applications provide context to large language models (LLMs). Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools."

● MCP Hosts: programs like Claude Desktop, IDEs, or AI tools that want to access data through MCP.
● MCP Clients: protocol clients that maintain 1:1 connections with servers.
● MCP Servers: lightweight programs that each expose specific capabilities through the standardized Model Context Protocol.
● Local Data Sources: files, databases, and services on your computer that MCP servers can securely access.
● Remote Services: external systems available over the internet (e.g. via APIs) that MCP servers can connect to.
https://www.anthropic.com/news/model-context-protocol
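A host wires these pieces together through configuration. For example, Claude Desktop registers MCP servers in its `claude_desktop_config.json`; a minimal sketch for a weather server like the one built below (the absolute path and the `uv` invocation are assumptions that depend on your machine):

```json
{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": ["--directory", "/ABSOLUTE/PATH/TO/weather", "run", "main.py"]
    }
  }
}
```

The host launches each configured server as a subprocess and talks to it over stdio.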
Test Project
Server
Use the uv we installed earlier to create a new project and activate its virtual environment:

```shell
uv init weather
cd weather
uv venv
source .venv/bin/activate
```
Install the dependencies:

```shell
uv add "mcp[cli]" httpx
```
Write the server:

```python
from typing import Any

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

NWS_API_BASE = "https://api.weather.gov"
USER_AGENT = "weather-app/1.0"


# Helper that issues requests to the NWS API
async def make_nws_request(url: str) -> dict[str, Any] | None:
    """Make a request to the NWS API with proper error handling."""
    headers = {
        "User-Agent": USER_AGENT,
        "Accept": "application/geo+json"
    }
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, headers=headers, timeout=30.0)
            response.raise_for_status()
            return response.json()
        except Exception:
            return None


# Fill the fields of one alert into a readable template
def format_alert(feature: dict) -> str:
    """Format an alert feature into a readable string."""
    props = feature["properties"]
    return f"""
    Event: {props.get('event', 'Unknown')}
    Area: {props.get('areaDesc', 'Unknown')}
    Severity: {props.get('severity', 'Unknown')}
    Description: {props.get('description', 'No description available')}
    Instructions: {props.get('instruction', 'No specific instructions provided')}
    """


# @mcp.tool() exposes this function as an MCP tool.
# It queries alerts by state: builds the URL, fetches it with
# make_nws_request, then formats each result with format_alert.
@mcp.tool()
async def get_alerts(state: str) -> str:
    """Get weather alerts for a US state.

    Args:
        state: Two-letter US state code (e.g. CA, NY)
    """
    url = f"{NWS_API_BASE}/alerts/active/area/{state}"
    data = await make_nws_request(url)
    if not data or "features" not in data:
        return "Unable to fetch alerts or no alerts found."
    if not data["features"]:
        return "No active alerts for this state."
    alerts = [format_alert(feature) for feature in data["features"]]
    return "\n---\n".join(alerts)


# @mcp.tool() exposes this function as an MCP tool.
# It queries the forecast by latitude/longitude; same pattern as get_alerts.
@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Get weather forecast for a location.

    Args:
        latitude: Latitude of the location
        longitude: Longitude of the location
    """
    # First get the forecast grid endpoint
    points_url = f"{NWS_API_BASE}/points/{latitude},{longitude}"
    points_data = await make_nws_request(points_url)
    if not points_data:
        return "Unable to fetch forecast data for this location."
    # Get the forecast URL from the points response
    forecast_url = points_data["properties"]["forecast"]
    forecast_data = await make_nws_request(forecast_url)
    if not forecast_data:
        return "Unable to fetch detailed forecast."
    # Format the periods into a readable forecast
    periods = forecast_data["properties"]["periods"]
    forecasts = []
    for period in periods[:5]:  # Only show next 5 periods
        forecast = f"""
        {period['name']}:
        Temperature: {period['temperature']}°{period['temperatureUnit']}
        Wind: {period['windSpeed']} {period['windDirection']}
        Forecast: {period['detailedForecast']}
        """
        forecasts.append(forecast)
    return "\n---\n".join(forecasts)


# Entry point: initialize and run the server over stdio
if __name__ == "__main__":
    mcp.run(transport='stdio')
```
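As a quick standalone sanity check of the template logic, you can feed a sample feature dict shaped like one entry of the NWS alerts response into the formatter (this sketch duplicates `format_alert` so it runs without the server or any network access):

```python
# Standalone copy of the server's format_alert, for an offline check
def format_alert(feature: dict) -> str:
    props = feature["properties"]
    return f"""
    Event: {props.get('event', 'Unknown')}
    Area: {props.get('areaDesc', 'Unknown')}
    Severity: {props.get('severity', 'Unknown')}
    Description: {props.get('description', 'No description available')}
    Instructions: {props.get('instruction', 'No specific instructions provided')}
    """

# A feature shaped like one entry of the NWS /alerts/active response
sample = {"properties": {"event": "Frost Advisory",
                         "areaDesc": "Southern Lake County",
                         "severity": "Minor"}}
print(format_alert(sample))
```

Missing keys fall back to their defaults, e.g. the description above prints as "No description available".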
Client
Create another new project:

```shell
uv init mcp-client
cd mcp-client
uv venv
source .venv/bin/activate
uv add mcp openai python-dotenv
```
Write the client code:

```python
import asyncio
import json
import sys
from typing import Optional
from contextlib import AsyncExitStack

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()


class MCPClient:
    def __init__(self):
        # Session state and model configuration live here
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()
        self.openai = OpenAI()
        self.model = "gpt-4o-mini"

    # Connect to the server
    async def connect_to_server(self, server_script_path: str):
        """Connect to an MCP server

        Args:
            server_script_path: Path to the server script (.py or .js)
        """
        # Support both Python and Node servers
        is_python = server_script_path.endswith('.py')
        is_js = server_script_path.endswith('.js')
        if not (is_python or is_js):
            raise ValueError("Server script must be a .py or .js file")
        command = "python" if is_python else "node"
        server_params = StdioServerParameters(
            command=command,
            args=[server_script_path],
            env=None
        )
        stdio_transport = await self.exit_stack.enter_async_context(stdio_client(server_params))
        self.stdio, self.write = stdio_transport
        self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))
        await self.session.initialize()

        # List the tools the server exposes
        response = await self.session.list_tools()
        tools = response.tools
        print("\nConnected to server with tools:", [tool.name for tool in tools])

    # Run one query
    async def process_query(self, query: str) -> str:
        """Process a query using the model and the available tools"""
        messages = [
            {
                "role": "user",
                "content": query
            }
        ]
        # 1. Fetch all available tools from the server
        response = await self.session.list_tools()
        # 2. Convert them to the format the OpenAI API expects
        available_tools = [{
            "type": "function",
            "function": {
                "name": tool.name,
                "description": tool.description,
                "parameters": tool.inputSchema
            }
        } for tool in response.tools]
        # 3. Pass the tools along with the chat request
        response = self.openai.chat.completions.create(
            model=self.model,
            max_tokens=1000,
            messages=messages,
            tools=available_tools
        )

        tool_results = []
        final_text = []
        for choice in response.choices:
            message = choice.message
            if not message.tool_calls:
                final_text.append(message.content)
            else:
                tool_name = message.tool_calls[0].function.name
                tool_args = json.loads(message.tool_calls[0].function.arguments)
                print(f"Tool Call: {tool_name}")
                print(f"Tool Call Param: {json.dumps(tool_args, ensure_ascii=False, indent=2)}")

                # Execute the tool call on the MCP server
                result = await self.session.call_tool(tool_name, tool_args)
                tool_results.append({"call": tool_name, "result": result})
                final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")

                if message.content:
                    messages.append({
                        "role": "assistant",
                        "content": message.content
                    })
                # Feed the tool result back to the model for a final answer
                messages.append({
                    "role": "user",
                    "content": result.content
                })
                response = self.openai.chat.completions.create(
                    model=self.model,
                    max_tokens=1000,
                    messages=messages,
                    tools=available_tools
                )
                final_text.append(response.choices[0].message.content)
        return "\n".join(final_text)

    async def chat_loop(self):
        """Run an interactive chat loop"""
        print("\nMCP Client Started!")
        print("Type your queries or 'quit' to exit.")
        while True:
            try:
                query = input("\nQuery: ").strip()
                if query.lower() == 'quit':
                    break
                response = await self.process_query(query)
                print("\n" + response)
            except Exception as e:
                print(f"\nError: {str(e)}")

    async def cleanup(self):
        """Clean up resources"""
        await self.exit_stack.aclose()


async def main():
    if len(sys.argv) < 2:
        print("Usage: python client.py <path_to_server_script>")
        sys.exit(1)
    client = MCPClient()
    try:
        await client.connect_to_server(sys.argv[1])
        await client.chat_loop()
    finally:
        await client.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
```
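The conversion in step 2 can be sketched in isolation. The `Tool` dataclass below is a hypothetical stand-in for the objects returned by `session.list_tools()`; the reshaping itself is exactly what `process_query` does before handing the tools to the OpenAI API:

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class Tool:
    # Hypothetical stand-in for mcp's tool objects (same attribute names)
    name: str
    description: str
    inputSchema: dict[str, Any]


tools = [
    Tool(
        name="get_alerts",
        description="Get weather alerts for a US state.",
        inputSchema={
            "type": "object",
            "properties": {"state": {"type": "string"}},
            "required": ["state"],
        },
    )
]

# Reshape MCP tool metadata into OpenAI's function-calling format
available_tools = [{
    "type": "function",
    "function": {
        "name": tool.name,
        "description": tool.description,
        "parameters": tool.inputSchema,
    },
} for tool in tools]

print(available_tools[0]["function"]["name"])  # get_alerts
```

The MCP `inputSchema` is already JSON Schema, so it maps directly onto the `parameters` field with no translation needed.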
Testing the Project
Here we start the client, which in turn launches the server.

The first argument is the client script, the second is the server script:

```shell
uv run main.py ../weather/main.py
```

After startup, the client prints the tools it fetched from the server.
We ask: "What are the weather alerts in California".
Note: the question needs to be about a US location, since the URL in our server points at the US National Weather Service API.
In the log below, we can see the client output:

```shell
Tool Call: get_alerts
Tool Call Param: {
  "state": "CA"
}
```
The full output looks like this:

```shell
➜ uv run main.py ../weather/main.py
Connected to server with tools: ['get_alerts', 'get_forecast']
MCP Client Started!
Type your queries or 'quit' to exit.
Query: What are the weather alerts in California
Tool Call: get_alerts
Tool Call Param: {
  "state": "CA"
}
[Calling tool get_alerts with args {'state': 'CA'}]
Here are the current weather alerts in California:
1. **Lake Wind Advisory**
   - **Area**: Greater Lake Tahoe Area
   - **Severity**: Moderate
   - **Description**: Southwest winds 15 to 25 mph with gusts up to 45 mph and waves 1 to 4 feet.
   - **When**: Until 5 AM PDT Thursday.
   - **Impacts**: Small boats, kayaks, and paddle boards will be prone to capsizing and should stay off lake waters until conditions improve.
   - **Instructions**: Check lake conditions before heading out and consider postponing boating activities until a day with less wind.
2. **Frost Advisory**
   - **Area**: Southeastern Mendocino Interior; Southern Lake County
   - **Severity**: Minor
   - **Description**: Temperatures as low as 33 will result in frost formation.
   - **When**: From 2 AM to 10 AM PDT Thursday.
   - **Impacts**: Frost could harm sensitive outdoor vegetation.
   - **Instructions**: Take steps now to protect tender plants from the cold.
3. **Freeze Warning**
   - **Area**: Northeastern Mendocino Interior
   - **Severity**: Moderate
   - **Description**: Sub-freezing temperatures as low as 30 expected.
   - **When**: From 3 AM to 10 AM PDT Thursday.
   - **Impacts**: Frost/Freeze damage to crops.
   - **Instructions**: Take steps now to protect tender plants from the cold.
4. **Freeze Warning**
   - **Area**: Northwestern Mendocino Interior
   - **Severity**: Moderate
   - **Description**: Sub-freezing temperatures as low as 29 expected.
   - **When**: From 3 AM to 10 AM PDT Thursday.
   - **Impacts**: Frost/Freeze damage to crops.
   - **Instructions**: Take steps now to protect tender plants from the cold.
5. **Freeze Warning**
   - **Area**: Northern Lake County
   - **Severity**: Moderate
   - **Description**: Sub-freezing temperatures as low as 29 expected.
   - **When**: From 3 AM to 10 AM PDT Thursday.
   - **Impacts**: Frost/Freeze damage to crops.
   - **Instructions**: Take steps now to protect tender plants from the cold.
6. **Winter Weather Advisory**
   - **Area**: Northern Trinity
   - **Severity**: Moderate
   - **Description**: Snow above 4000 feet, with additional snow accumulations up to one inch.
   - **When**: Until 1 AM PDT Thursday.
   - **Impacts**: Plan on slippery road conditions.
   - **Instructions**: Slow down and use caution while traveling.
7. **Winter Weather Advisory**
   - **Area**: Western Siskiyou County
   - **Severity**: Moderate
   - **Description**: Snow expected, with total accumulations between 6 and 8 inches and up to 12 inches over high remote terrain. Winds gusting as high as 55 mph.
   - **When**: Until 11 AM PDT Thursday.
   - **Impacts**: Travel could be difficult, affecting commutes.
   - **Instructions**: Slow down and use caution while traveling. Call 511 or visit quickmap.dot.ca.gov for road information.
Please take necessary precautions based on these alerts!
Query:
```