Autogen4j: the Java version of Microsoft AutoGen

https://github.com/HamaWhiteGG/autogen4j

The Java version of Microsoft AutoGen, enabling next-gen large language model applications.

1. What is AutoGen

AutoGen is a framework that enables the development of LLM applications using multiple agents that can converse with each other to solve tasks. AutoGen agents are customizable, conversable, and seamlessly allow human participation. They can operate in various modes that employ combinations of LLMs, human inputs, and tools.

The examples below can be found in the autogen4j-example module.

2. Quickstart

2.1 Maven Repository

Prerequisites for building:

  • Java 17 or later
  • Unix-like environment (we use Linux, Mac OS X)
  • Maven (we recommend version 3.8.6 and require at least 3.5.4)
Add the following dependency to your pom.xml:
<dependency>
    <groupId>io.github.hamawhitegg</groupId>
    <artifactId>autogen4j-core</artifactId>
    <version>0.1.0</version>
</dependency>

2.2 Environment Setup

Using Autogen4j requires OpenAI's APIs, so you need to set the OPENAI_API_KEY environment variable:

export OPENAI_API_KEY=xxx
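
Autogen4j reads the key from the environment, so before running any example you can fail fast if the variable is missing. The check below is a minimal sketch that uses only the Java standard library (it is not an Autogen4j API):

public class CheckOpenAiKey {
    public static void main(String[] args) {
        // fail fast if the key is missing or blank
        String key = System.getenv("OPENAI_API_KEY");
        if (key == null || key.isBlank()) {
            throw new IllegalStateException("OPENAI_API_KEY is not set");
        }
        System.out.println("OPENAI_API_KEY is set (" + key.length() + " characters)");
    }
}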

3. Multi-Agent Conversation Framework

AutoGen enables next-gen LLM applications with a generic multi-agent conversation framework. It offers customizable and conversable agents that integrate LLMs, tools, and humans.

By automating chat among multiple capable agents, one can easily make them collectively perform tasks autonomously or with human feedback, including tasks that require using tools via code.

Features of this use case include:

  • Multi-agent conversations: AutoGen agents can communicate with each other to solve tasks. This allows for more complex and sophisticated applications than would be possible with a single LLM.
  • Customization: AutoGen agents can be customized to meet the specific needs of an application. This includes the ability to choose the LLMs to use, the types of human input to allow, and the tools to employ (see the sketch after this list).
  • Human participation: AutoGen seamlessly allows human participation. This means that humans can provide input and feedback to the agents as needed.
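
As a small illustration of the customization point above, the sketch below reuses only the builder options that appear in the examples that follow (name, system message, human input mode, code execution config); the agent names and working directory are made up for illustration.

// a hypothetical "researcher" assistant with a custom role prompt
var researcher = AssistantAgent.builder()
        .name("researcher")
        .systemMessage("You are a careful research assistant.")
        .build();

// a user proxy that never asks for human input and runs code in its own working directory
var customConfig = CodeExecutionConfig.builder()
        .workDir("data/custom_example")
        .build();
var proxy = UserProxyAgent.builder()
        .name("proxy")
        .humanInputMode(NEVER)
        .codeExecutionConfig(customConfig)
        .build();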

3.1 Auto Feedback From Code Execution Example


// create an AssistantAgent named "assistant"
var assistant = AssistantAgent.builder()
        .name("assistant")
        .build();

// create a code execution config; generated code runs in the data/coding working directory
var codeExecutionConfig = CodeExecutionConfig.builder()
        .workDir("data/coding")
        .build();
// create a UserProxyAgent instance named "user_proxy"
var userProxy = UserProxyAgent.builder()
        .name("user_proxy")
        .humanInputMode(NEVER)
        .maxConsecutiveAutoReply(10)
        .isTerminationMsg(e -> e.getContent().strip().endsWith("TERMINATE"))
        .codeExecutionConfig(codeExecutionConfig)
        .build();

// the assistant receives a message from the user_proxy, which contains the task description
userProxy.initiateChat(assistant,
        "What date is today? Compare the year-to-date gain for META and TESLA.");

// followup of the previous question
userProxy.send(assistant,
        "Plot a chart of their stock price change YTD and save to stock_price_ytd.png.");

In this flow, the assistant proposes code, and the user proxy executes it in the data/coding working directory and sends the result back, continuing for up to 10 automatic replies or until it receives a message ending with TERMINATE. The figure below shows an example conversation flow with Autogen4j.

After running, you can check the file coding_output.log for the output logs.

The final output is as shown in the following picture.

3.2 Group Chat Example


// create a code execution config for the group chat; generated code runs in data/group_chat
var codeExecutionConfig = CodeExecutionConfig.builder()
        .workDir("data/group_chat")
        .lastMessagesNumber(2)
        .build();

// create a UserProxyAgent instance named "user_proxy"
var userProxy = UserProxyAgent.builder()
        .name("user_proxy")
        .systemMessage("A human admin.")
        .humanInputMode(TERMINATE)
        .codeExecutionConfig(codeExecutionConfig)
        .build();

// create an AssistantAgent named "coder"
var coder = AssistantAgent.builder()
        .name("coder")
        .build();

// create an AssistantAgent named "pm"
var pm = AssistantAgent.builder()
        .name("product_manager")
        .systemMessage("Creative in software product ideas.")
        .build();

// create a group chat among the user proxy, coder, and product manager, with at most 12 rounds
var groupChat = GroupChat.builder()
        .agents(List.of(userProxy, coder, pm))
        .maxRound(12)
        .build();

// create a GroupChatManager named "manager"
var manager = GroupChatManager.builder()
        .groupChat(groupChat)
        .build();

userProxy.initiateChat(manager,
        "Find a latest paper about gpt-4 on arxiv and find its potential applications in software.");

The GroupChatManager drives the conversation among the three agents for at most the 12 rounds configured above, and the user proxy executes any code produced along the way under data/group_chat. After running, you can check the file group_chat_output.log for the output logs.

4. Run Test Cases from Source

git clone https://github.com/HamaWhiteGG/autogen4j.git
cd autogen4j

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn clean test
mvn clean test

This project uses Spotless to format the code.

If you make any modifications, please remember to format the code using the following command.

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn spotless:apply
mvn spotless:apply

5. Support

Don't hesitate to ask!

Open an issue if you find a bug or need any help.
