Autogen_core: Agent and Agent Runtime

Contents

1. Code

```python
from dataclasses import dataclass

from autogen_core import AgentId, MessageContext, RoutedAgent, message_handler


@dataclass
class MyMessageType:
    content: str


class MyAgent(RoutedAgent):
    def __init__(self) -> None:
        super().__init__("MyAgent")

    @message_handler
    async def handle_my_message_type(self, message: MyMessageType, ctx: MessageContext) -> None:
        print(f"{self.id.type} received message: {message.content}")
```
```python
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient


class MyAssistant(RoutedAgent):
    def __init__(self, name: str) -> None:
        super().__init__(name)
        model_client = OpenAIChatCompletionClient(
            model="GLM-4-Air-0111",
            api_key="your api key",
            base_url="https://open.bigmodel.cn/api/paas/v4/",
            model_capabilities={
                "vision": True,
                "function_calling": True,
                "json_output": True,
            },
        )
        self._delegate = AssistantAgent(name, model_client=model_client)

    @message_handler
    async def handle_my_message_type(self, message: MyMessageType, ctx: MessageContext) -> None:
        print(f"{self.id.type} received message: {message.content}")
        response = await self._delegate.on_messages(
            [TextMessage(content=message.content, source="user")], ctx.cancellation_token
        )
        print(f"{self.id.type} responded: {response.chat_message.content}")
```
```python
from autogen_core import SingleThreadedAgentRuntime

runtime = SingleThreadedAgentRuntime()
await MyAgent.register(runtime, "my_agent", lambda: MyAgent())
await MyAssistant.register(runtime, "my_assistant", lambda: MyAssistant("my_assistant"))
```

```
AgentType(type='my_assistant')
```
```python
runtime.start()  # Start processing messages in the background.
await runtime.send_message(MyMessageType("Hello, World!"), AgentId("my_agent", "default"))
await runtime.send_message(MyMessageType("Hello, World!"), AgentId("my_assistant", "default"))
await runtime.stop()  # Stop processing messages in the background.
```

```
my_agent received message: Hello, World!
my_assistant received message: Hello, World!
my_assistant responded: Hello! How can I help you today?
```

2. Code Explanation

This code shows how to use the AutoGen framework to create and run two agents, each of which can receive and respond to a specific message type. The logic breaks down as follows:

Part 1: Defining the Message Type and the Agent

```python
from dataclasses import dataclass

from autogen_core import AgentId, MessageContext, RoutedAgent, message_handler


@dataclass
class MyMessageType:
    content: str


class MyAgent(RoutedAgent):
    def __init__(self) -> None:
        super().__init__("MyAgent")

    @message_handler
    async def handle_my_message_type(self, message: MyMessageType, ctx: MessageContext) -> None:
        print(f"{self.id.type} received message: {message.content}")
```

  1. Define the message type: `@dataclass` declares a data class `MyMessageType` that represents a message with a single `content` field.
  2. Define the agent class: `MyAgent` inherits from `RoutedAgent`; its initializer calls the parent initializer, passing the agent's description string.
  3. Message handler: the `@message_handler` decorator marks the async method `handle_my_message_type` as the handler for messages of type `MyMessageType`; when such a message arrives, the handler prints its content.
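Under the hood, `RoutedAgent` routes each incoming message to the handler whose message-type annotation matches. As a rough, framework-free sketch of that idea (this `MiniRouter` is a hypothetical stand-in that registers handlers explicitly, not AutoGen's actual implementation, which inspects the handler's parameter annotation):

```python
import asyncio
from dataclasses import dataclass


@dataclass
class TextMsg:
    content: str


@dataclass
class NumberMsg:
    value: int


class MiniRouter:
    """Dispatch each message to the handler registered for its exact type."""

    def __init__(self) -> None:
        self._handlers = {}

    def handler(self, msg_type):
        # Register the decorated coroutine as the handler for msg_type.
        def decorator(fn):
            self._handlers[msg_type] = fn
            return fn
        return decorator

    async def dispatch(self, message):
        return await self._handlers[type(message)](message)


router = MiniRouter()


@router.handler(TextMsg)
async def on_text(message: TextMsg) -> str:
    return f"text: {message.content}"


@router.handler(NumberMsg)
async def on_number(message: NumberMsg) -> str:
    return f"number: {message.value * 2}"


print(asyncio.run(router.dispatch(TextMsg("Hello"))))  # text: Hello
print(asyncio.run(router.dispatch(NumberMsg(21))))     # number: 42
```

The key point is the same in both cases: the message's Python type, not its content, decides which handler runs.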

Part 2: Defining the Assistant Agent

```python
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient


class MyAssistant(RoutedAgent):
    def __init__(self, name: str) -> None:
        super().__init__(name)
        model_client = OpenAIChatCompletionClient(
            model="GLM-4-Air-0111",
            api_key="your api key",
            base_url="https://open.bigmodel.cn/api/paas/v4/",
            model_capabilities={
                "vision": True,
                "function_calling": True,
                "json_output": True,
            },
        )
        self._delegate = AssistantAgent(name, model_client=model_client)

    @message_handler
    async def handle_my_message_type(self, message: MyMessageType, ctx: MessageContext) -> None:
        print(f"{self.id.type} received message: {message.content}")
        response = await self._delegate.on_messages(
            [TextMessage(content=message.content, source="user")], ctx.cancellation_token
        )
        print(f"{self.id.type} responded: {response.chat_message.content}")
```

  1. Define the assistant agent class: `MyAssistant` also inherits from `RoutedAgent`.
  2. Initialize the assistant agent: the initializer creates an `OpenAIChatCompletionClient` for talking to an OpenAI-compatible chat model, then wraps an `AssistantAgent` as the agent's delegate.
  3. Message handler: the async method `handle_my_message_type` prints the incoming message content, forwards it to the chat model through the delegate `_delegate`, and prints the model's response.
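The shape of `MyAssistant` is plain delegation: the outer agent owns an inner component and forwards every message to it. Stripped of AutoGen, the pattern looks like this (`FakeModel` is a hypothetical stand-in for the chat-completion client):

```python
import asyncio


class FakeModel:
    """Hypothetical stand-in for a chat-completion client."""

    async def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


class DelegatingAgent:
    """Outer agent that forwards every message to an inner delegate."""

    def __init__(self, model: FakeModel) -> None:
        self._delegate = model

    async def handle(self, content: str) -> str:
        # Forward the message and surface the delegate's response.
        return await self._delegate.complete(content)


agent = DelegatingAgent(FakeModel())
print(asyncio.run(agent.handle("Hello, World!")))  # echo: Hello, World!
```

Keeping the model client inside the delegate means the outer agent only deals with routing and logging, which is exactly how `MyAssistant` splits the work with its wrapped `AssistantAgent`.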

Part 3: Registering and Running the Agents

```python
from autogen_core import SingleThreadedAgentRuntime

runtime = SingleThreadedAgentRuntime()
await MyAgent.register(runtime, "my_agent", lambda: MyAgent())
await MyAssistant.register(runtime, "my_assistant", lambda: MyAssistant("my_assistant"))
```

  1. Create the runtime: a `SingleThreadedAgentRuntime` instance manages the agents' execution.
  2. Register the agents: the `register` method registers the `MyAgent` and `MyAssistant` agents, each under an agent type string with a factory that builds instances on demand.
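Registration hands the runtime a factory rather than an instance, so agents can be created lazily, one per agent ID. A minimal sketch of that bookkeeping (a hypothetical `MiniRegistry`, not the real runtime's internals):

```python
class MiniRegistry:
    """Lazily build one agent instance per (agent_type, key) from a registered factory."""

    def __init__(self) -> None:
        self._factories = {}
        self._instances = {}

    def register(self, agent_type: str, factory) -> None:
        self._factories[agent_type] = factory

    def get(self, agent_type: str, key: str = "default"):
        agent_id = (agent_type, key)  # analogous to AgentId(type, key)
        if agent_id not in self._instances:
            # First request for this ID: call the factory and cache the result.
            self._instances[agent_id] = self._factories[agent_type]()
        return self._instances[agent_id]


class EchoAgent:
    pass


registry = MiniRegistry()
registry.register("echo", lambda: EchoAgent())
a = registry.get("echo")
b = registry.get("echo")
print(a is b)                              # True: one instance per (type, key)
print(a is registry.get("echo", "other"))  # False: a different key gets a new instance
```

This is why `register` takes `lambda: MyAgent()` instead of `MyAgent()`: the runtime decides when (and how many times) to invoke the factory.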

Part 4: Sending Messages and Stopping Processing

```python
runtime.start()  # Start processing messages in the background.
await runtime.send_message(MyMessageType("Hello, World!"), AgentId("my_agent", "default"))
await runtime.send_message(MyMessageType("Hello, World!"), AgentId("my_assistant", "default"))
await runtime.stop()  # Stop processing messages in the background.
```

  1. Start message processing: `runtime.start()` begins processing messages in the background.
  2. Send messages: `runtime.send_message` delivers a `MyMessageType` message to each agent, addressed by `AgentId`.
  3. Stop message processing: `runtime.stop()` stops the background processing.
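The start/send/stop lifecycle maps naturally onto a queue drained by a background task. The sketch below (a hypothetical `MiniQueueRuntime`, simplified from what `SingleThreadedAgentRuntime` actually does) shows the shape: `start()` launches the drain loop, `send_message` enqueues, and `stop()` waits for the queue to empty before cancelling the loop:

```python
import asyncio


class MiniQueueRuntime:
    """Queue-backed message loop: start() launches a background drain task."""

    def __init__(self, handler) -> None:
        self._queue = asyncio.Queue()
        self._handler = handler
        self._task = None

    def start(self) -> None:
        self._task = asyncio.create_task(self._run())

    async def _run(self) -> None:
        while True:
            message = await self._queue.get()
            await self._handler(message)
            self._queue.task_done()

    async def send_message(self, message) -> None:
        await self._queue.put(message)

    async def stop(self) -> None:
        await self._queue.join()  # wait until all queued messages are handled
        self._task.cancel()


async def main() -> list:
    received = []

    async def handle(message):
        received.append(message)

    runtime = MiniQueueRuntime(handle)
    runtime.start()
    await runtime.send_message("Hello, World!")
    await runtime.stop()
    return received


print(asyncio.run(main()))  # ['Hello, World!']
```

Because everything runs on one event loop, no locks are needed: messages are handled one at a time, in order, which is the essence of a single-threaded runtime.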

Summary

This code demonstrates how to build two agents with the AutoGen framework: `MyAgent` receives and prints messages, while `MyAssistant` receives messages and generates responses through an OpenAI-compatible chat model. The `SingleThreadedAgentRuntime` runtime is used to start the agents, deliver messages to them, and stop message processing.

3. A Similar Example

```python
from dataclasses import dataclass

from autogen_core import AgentId, MessageContext, RoutedAgent, message_handler


# Define a new message type that contains two numbers and the operation to be performed
@dataclass
class MathOperationMessage:
    number1: float
    number2: float
    operation: str  # operation can be 'add', 'subtract', 'multiply', 'divide'


class MathAgent(RoutedAgent):
    def __init__(self) -> None:
        super().__init__("MathAgent")

    @message_handler
    async def handle_math_operation_message(self, message: MathOperationMessage, ctx: MessageContext) -> None:
        if message.operation == "add":
            result = message.number1 + message.number2
        elif message.operation == "subtract":
            result = message.number1 - message.number2
        elif message.operation == "multiply":
            result = message.number1 * message.number2
        elif message.operation == "divide":
            if message.number2 != 0:
                result = message.number1 / message.number2
            else:
                result = "Error: Division by zero"
        else:
            result = "Error: Unknown operation"

        print(f"{self.id.type} received message: {message}")
        print(f"{self.id.type} performed operation: {message.operation} on {message.number1} and {message.number2}")
        print(f"{self.id.type} result: {result}")


# Define another assistant agent that can respond with basic information
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient


class MathAssistant(RoutedAgent):
    def __init__(self, name: str) -> None:
        super().__init__(name)
        model_client = OpenAIChatCompletionClient(
            model="GLM-4-Air-0111",
            api_key="your api key",  # Replace with your actual API key
            base_url="https://open.bigmodel.cn/api/paas/v4/",
            model_capabilities={
                "vision": True,
                "function_calling": True,
                "json_output": True,
            },
        )
        self._delegate = AssistantAgent(name, model_client=model_client)

    @message_handler
    async def handle_math_message(self, message: MathOperationMessage, ctx: MessageContext) -> None:
        print(f"{self.id.type} received message: {message}")
        response = await self._delegate.on_messages(
            [TextMessage(content=f"Performing operation: {message.operation} on {message.number1} and {message.number2}", source="user")],
            ctx.cancellation_token
        )
        print(f"{self.id.type} responded: {response.chat_message.content}")


# Set up the runtime and agents
from autogen_core import SingleThreadedAgentRuntime

runtime = SingleThreadedAgentRuntime()
await MathAgent.register(runtime, "math_agent", lambda: MathAgent())
await MathAssistant.register(runtime, "math_assistant", lambda: MathAssistant("math_assistant"))

# Start processing messages
runtime.start()

# Send a message to the MathAgent for processing
await runtime.send_message(MathOperationMessage(3, 5, "add"), AgentId("math_agent", "default"))
await runtime.send_message(MathOperationMessage(10, 2, "subtract"), AgentId("math_agent", "default"))

# Send a message to the MathAssistant to provide context about the operation
await runtime.send_message(MathOperationMessage(7, 4, "multiply"), AgentId("math_assistant", "default"))

# Stop the runtime
await runtime.stop()
```
````
math_agent received message: MathOperationMessage(number1=3, number2=5, operation='add')
math_agent performed operation: add on 3 and 5
math_agent result: 8
math_agent received message: MathOperationMessage(number1=10, number2=2, operation='subtract')
math_agent performed operation: subtract on 10 and 2
math_agent result: 8
math_assistant received message: MathOperationMessage(number1=7, number2=4, operation='multiply')
math_assistant responded: ```
7 * 4 = 28
```

TERMINATE
````
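The if/elif chain inside `handle_math_operation_message` can also be written as a table of operator functions. This refactor is a hypothetical alternative, not part of the original example, but it keeps the two error cases explicit while making it trivial to add operations:

```python
import operator

# Map operation names to binary functions from the stdlib operator module.
OPERATIONS = {
    "add": operator.add,
    "subtract": operator.sub,
    "multiply": operator.mul,
    "divide": operator.truediv,
}


def apply_operation(number1: float, number2: float, operation: str):
    fn = OPERATIONS.get(operation)
    if fn is None:
        return "Error: Unknown operation"
    if operation == "divide" and number2 == 0:
        return "Error: Division by zero"
    return fn(number1, number2)


print(apply_operation(3, 5, "add"))      # 8
print(apply_operation(10, 0, "divide"))  # Error: Division by zero
```

Supporting a new operation then becomes a one-line dictionary entry instead of another elif branch.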

Reference: https://microsoft.github.io/autogen/stable/user-guide/core-user-guide/framework/agent-and-agent-runtime.html
