A quick one-article introduction to what entry-level AI agents are all about.
The Starter AI Agents projects come from the GitHub repo Shubhamsaboo/awesome-llm-apps: a collection of awesome LLM apps with AI Agents and RAG using OpenAI, Anthropic, Gemini, and open-source models.
Starter AI Agents covers a range of practical AI tools, including blog-to-podcast conversion, breakup recovery, data analysis, medical imaging, meme generation, music composition, travel planning, multimodal processing, news aggregation, mixture-of-agents collaboration, financial analysis, paper analysis, and web scraping, spanning everyday life, entertainment, health, finance, and more.
1. Building an agent with the agno framework: blog to podcast
agno is arguably the simplest yet most complete agent framework; a fully working agent takes just a few lines:
```python
from agno.agent import Agent, RunResponse
from agno.models.openai import OpenAIChat
from agno.tools.eleven_labs import ElevenLabsTools
from agno.tools.firecrawl import FirecrawlTools

blog_to_podcast_agent = Agent(
    name="Blog to Podcast Agent",
    agent_id="blog_to_podcast_agent",
    model=OpenAIChat(id="gpt-4o"),
    tools=[
        ElevenLabsTools(
            voice_id="JBFqnCBsd6RMkjVDRZzb",
            model_id="eleven_multilingual_v2",
            target_directory="audio_generations",
        ),
        FirecrawlTools(),
    ],
    description="You are an AI agent that can generate audio using the ElevenLabs API.",
    instructions=[
        "When the user provides a blog URL:",
        "1. Use FirecrawlTools to scrape the blog content",
        "2. Create a concise summary of the blog content that is NO MORE than 2000 characters long",
        "3. The summary should capture the main points while being engaging and conversational",
        "4. Use the ElevenLabsTools to convert the summary to audio",
        "Ensure the summary is within the 2000 character limit to avoid ElevenLabs API limits",
    ],
    markdown=True,
    debug_mode=True,
)
podcast: RunResponse = blog_to_podcast_agent.run(
    # `url` is the blog URL provided by the user (e.g. from a text input)
    f"Convert the blog content to a podcast: {url}"
)
```
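After the run, ElevenLabsTools writes the generated audio into the `target_directory` configured above ("audio_generations"). A minimal sketch for picking up the newest file; the glob pattern and naming are assumptions, not part of the original example:

```python
import glob
import os

# Hypothetical post-processing: grab the most recently written MP3
# from the directory ElevenLabsTools was configured to save into.
audio_files = glob.glob(os.path.join("audio_generations", "*.mp3"))
latest_audio = max(audio_files, key=os.path.getmtime) if audio_files else None
print(f"Generated podcast audio: {latest_audio}")
```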
2. Multi-agent news writing with OpenAI Swarm
OpenAI Swarm is a multi-agent orchestration technique: several AI agents (possibly backed by different GPT variants) work in parallel or in sequence, combining their strengths to improve the accuracy, reliability, and efficiency of a task. It suits complex analysis, data verification, and content generation, for example local news aggregation (cross-validating information across models) or research assistance (analyzing literature from multiple angles). The core idea is "swarm intelligence": a diverse set of models reduces the bias or error risk of any single AI.
```python
from datetime import datetime

from duckduckgo_search import DDGS
from swarm import Swarm, Agent

client = Swarm()   # Swarm client that drives the agents below
MODEL = "gpt-4o"   # any OpenAI chat model id works here

# Tool: search for news
def search_news(topic):
    """Search for news articles using DuckDuckGo"""
    with DDGS() as ddg:
        results = ddg.text(f"{topic} news {datetime.now().strftime('%Y-%m')}", max_results=3)
        if results:
            news_results = "\n\n".join([
                f"Title: {result['title']}\nURL: {result['href']}\nSummary: {result['body']}"
                for result in results
            ])
            return news_results
        return f"No news found for {topic}."
# Search agent
search_agent = Agent(
    name="News Searcher",
    instructions="""
    You are a news search specialist. Your task is to:
    1. Search for the most relevant and recent news on the given topic
    2. Ensure the results are from reputable sources
    3. Return the raw search results in a structured format
    """,
    functions=[search_news],
    model=MODEL
)
# Synthesis agent
synthesis_agent = Agent(
    name="News Synthesizer",
    instructions="""
    You are a news synthesis expert. Your task is to:
    1. Analyze the raw news articles provided
    2. Identify the key themes and important information
    3. Combine information from multiple sources
    4. Create a comprehensive but concise synthesis
    5. Focus on facts and maintain journalistic objectivity
    6. Write in a clear, professional style
    Provide a 2-3 paragraph synthesis of the main points.
    """,
    model=MODEL
)
# Summary agent
summary_agent = Agent(
    name="News Summarizer",
    instructions="""
    You are an expert news summarizer combining AP and Reuters style clarity with digital-age brevity.
    Your task:
    1. Core Information:
       - Lead with the most newsworthy development
       - Include key stakeholders and their actions
       - Add critical numbers/data if relevant
       - Explain why this matters now
       - Mention immediate implications
    2. Style Guidelines:
       - Use strong, active verbs
       - Be specific, not general
       - Maintain journalistic objectivity
       - Make every word count
       - Explain technical terms if necessary
    Format: Create a single paragraph of 250-400 words that informs and engages.
    Pattern: [Major News] + [Key Details/Data] + [Why It Matters/What's Next]
    Focus on answering: What happened? Why is it significant? What's the impact?
    IMPORTANT: Provide ONLY the summary paragraph. Do not include any introductory phrases,
    labels, or meta-text like "Here's a summary" or "In AP/Reuters style."
    Start directly with the news content.
    """,
    model=MODEL
)
# Run the multi-agent pipeline, step 1: search for news
search_response = client.run(
    agent=search_agent,
    messages=[{"role": "user", "content": f"Find recent news about {topic}"}]
)
raw_news = search_response.messages[-1]["content"]

# Step 2: synthesize the articles
status.write("🔄 Synthesizing information...")
synthesis_response = client.run(
    agent=synthesis_agent,
    messages=[{"role": "user", "content": f"Synthesize these news articles:\n{raw_news}"}]
)
synthesized_news = synthesis_response.messages[-1]["content"]

# Step 3: summarize the synthesis
status.write("📝 Creating summary...")
summary_response = client.run(
    agent=summary_agent,
    messages=[{"role": "user", "content": f"Summarize this synthesis:\n{synthesized_news}"}]
)
final_summary = summary_response.messages[-1]["content"]
```
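`topic` and `status` are not defined in the excerpt; in the original Streamlit app they come from a text input and a status container. A rough sketch of that wiring (widget labels and defaults are assumptions):

```python
import streamlit as st

# Hypothetical wiring for the `topic` and `status` names used above
topic = st.text_input("Enter a news topic", value="artificial intelligence")
status = st.status("Processing news...", expanded=True)
status.write("🔍 Searching for news...")
```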
3. Medical imaging diagnosis agent (complex image recognition plus agent processing)
An example of an agent that combines image understanding and web search:
```python
import streamlit as st

from agno.agent import Agent
from agno.media import Image as AgnoImage
from agno.models.google import Gemini
from agno.tools.duckduckgo import DuckDuckGoTools

medical_agent = Agent(
    model=Gemini(
        id="gemini-2.0-flash",
        api_key=st.session_state.GOOGLE_API_KEY
    ),
    tools=[DuckDuckGoTools()],
    markdown=True
) if st.session_state.GOOGLE_API_KEY else None
# Medical Analysis Query
query = """
You are a highly skilled medical imaging expert with extensive knowledge in radiology and diagnostic imaging. Analyze the patient's medical image and structure your response as follows:
### 1. Image Type & Region
- Specify imaging modality (X-ray/MRI/CT/Ultrasound/etc.)
- Identify the patient's anatomical region and positioning
- Comment on image quality and technical adequacy
### 2. Key Findings
- List primary observations systematically
- Note any abnormalities in the patient's imaging with precise descriptions
- Include measurements and densities where relevant
- Describe location, size, shape, and characteristics
- Rate severity: Normal/Mild/Moderate/Severe
### 3. Diagnostic Assessment
- Provide primary diagnosis with confidence level
- List differential diagnoses in order of likelihood
- Support each diagnosis with observed evidence from the patient's imaging
- Note any critical or urgent findings
### 4. Patient-Friendly Explanation
- Explain the findings in simple, clear language that the patient can understand
- Avoid medical jargon or provide clear definitions
- Include visual analogies if helpful
- Address common patient concerns related to these findings
### 5. Research Context
IMPORTANT: Use the DuckDuckGo search tool to:
- Find recent medical literature about similar cases
- Search for standard treatment protocols
- Provide a list of relevant medical links of them too
- Research any relevant technological advances
- Include 2-3 key references to support your analysis
Format your response using clear markdown headers and bullet points. Be concise yet thorough.
"""
agno_image = AgnoImage(filepath=temp_path)  # temp_path: local path to the uploaded image

# Run analysis
response = medical_agent.run(query, images=[agno_image])
```
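`temp_path` is assumed to be a local path to the image; in the original Streamlit app it comes from a file uploader. A minimal sketch of that plumbing (widget label and suffix handling are assumptions):

```python
import tempfile

import streamlit as st

# Hypothetical uploader wiring: persist the uploaded image to a
# temporary file so the agent can read it from disk.
uploaded_file = st.file_uploader("Upload a medical image", type=["jpg", "jpeg", "png"])
if uploaded_file is not None:
    suffix = "." + uploaded_file.name.rsplit(".", 1)[-1]
    with tempfile.NamedTemporaryFile(delete=False, suffix=suffix) as tmp:
        tmp.write(uploaded_file.getvalue())
        temp_path = tmp.name
```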
4. Data analysis agent:
```python
import json

import streamlit as st
from agno.models.openai import OpenAIChat
from agno.tools.pandas import PandasTools
from phi.agent.duckdb import DuckDbAgent

# Semantic model describing the uploaded dataset for SQL generation
semantic_model = {
    "tables": [
        {
            "name": "uploaded_data",
            "description": "Contains the uploaded dataset.",
            "path": temp_path,  # temp_path: local path to the uploaded data file
        }
    ]
}
# Initialize the DuckDbAgent for SQL query generation
duckdb_agent = DuckDbAgent(
    model=OpenAIChat(model="gpt-4", api_key=st.session_state.openai_key),
    semantic_model=json.dumps(semantic_model),
    tools=[PandasTools()],
    markdown=True,
    add_history_to_messages=False,  # Disable chat history
    followups=False,  # Disable follow-up queries
    read_tool_call_history=False,  # Disable reading tool call history
    system_prompt="You are an expert data analyst. Generate SQL queries to solve the user's query. Return only the SQL query, enclosed in ```sql ``` and give the final answer.",
)
```
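Once the agent exists, you ask it questions in natural language and it answers by generating and running SQL against the uploaded file. A hedged usage sketch (the question is made up; `print_response` streams the answer in phi-style agents):

```python
# Hypothetical natural-language query against the uploaded dataset
duckdb_agent.print_response(
    "Show the 5 rows with the highest revenue, as a table.",
    stream=True,
)
```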
5. Travel planning agent (research + plan)
```python
from textwrap import dedent

from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.serpapi import SerpApiTools

researcher = Agent(
    name="Researcher",
    role="Searches for travel destinations, activities, and accommodations based on user preferences",
    model=OpenAIChat(id="gpt-4o", api_key=openai_api_key),
    description=dedent(
        """
        You are a world-class travel researcher. Given a travel destination and the number of days the user wants to travel for,
        generate a list of search terms for finding relevant travel activities and accommodations.
        Then search the web for each term, analyze the results, and return the 10 most relevant results.
        """
    ),
    instructions=[
        "Given a travel destination and the number of days the user wants to travel for, first generate a list of 3 search terms related to that destination and the number of days.",
        "For each search term, `search_google` and analyze the results.",
        "From the results of all searches, return the 10 most relevant results to the user's preferences.",
        "Remember: the quality of the results is important.",
    ],
    tools=[SerpApiTools(api_key=serp_api_key)],
    add_datetime_to_instructions=True,
)
planner = Agent(
    name="Planner",
    role="Generates a draft itinerary based on user preferences and research results",
    model=OpenAIChat(id="gpt-4o", api_key=openai_api_key),
    description=dedent(
        """
        You are a senior travel planner. Given a travel destination, the number of days the user wants to travel for, and a list of research results,
        your goal is to generate a draft itinerary that meets the user's needs and preferences.
        """
    ),
    instructions=[
        "Given a travel destination, the number of days the user wants to travel for, and a list of research results, generate a draft itinerary that includes suggested activities and accommodations.",
        "Ensure the itinerary is well-structured, informative, and engaging.",
        "Ensure you provide a nuanced and balanced itinerary, quoting facts where possible.",
        "Remember: the quality of the itinerary is important.",
        "Focus on clarity, coherence, and overall quality.",
        "Never make up facts or plagiarize. Always provide proper attribution.",
    ],
    add_datetime_to_instructions=True,
)
# Run the two agents in sequence: research first, then plan
research_results = researcher.run(f"Research {destination} for a {num_days} day trip", stream=False)
prompt = f"""
Destination: {destination}
Duration: {num_days} days
Research Results: {research_results.content}
Please create a detailed itinerary based on this research.
"""
response = planner.run(prompt, stream=False)
```
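`destination`, `num_days`, and the two API keys are never defined in the excerpt; in the original app they come from Streamlit inputs. A rough sketch of that setup (all widget labels and defaults are assumptions):

```python
import streamlit as st

# Hypothetical input wiring for the names used above
openai_api_key = st.sidebar.text_input("OpenAI API Key", type="password")
serp_api_key = st.sidebar.text_input("SerpAPI Key", type="password")
destination = st.text_input("Where do you want to go?")
num_days = st.number_input("How many days?", min_value=1, max_value=30, value=7)
```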
6. Web scraping agent:
It uses ScrapeGraphAI, an AI web-scraping tool that automatically analyzes the target page's structure and extracts the key data.
```python
import streamlit as st
from scrapegraphai.graphs import SmartScraperGraph

model = st.radio(
    "Select the model",
    ["gpt-3.5-turbo", "gpt-4"],
    index=0,
)
graph_config = {
    "llm": {
        "api_key": openai_access_token,
        "model": model,
    },
}

# Get the URL of the website to scrape
url = st.text_input("Enter the URL of the website you want to scrape")

# Get the user prompt
user_prompt = st.text_input("What do you want the AI agent to scrape from the website?")
# Create a SmartScraperGraph object
smart_scraper_graph = SmartScraperGraph(
    prompt=user_prompt,
    source=url,
    config=graph_config
)
```
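The excerpt stops before the graph is executed; `run()` kicks off the scrape and returns the extracted data as a Python object. In the original app this hangs off a button, roughly (the button label is an assumption):

```python
# Hypothetical trigger: run the scraper and display the result
if st.button("Scrape"):
    result = smart_scraper_graph.run()
    st.write(result)
```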
7. Deep Research agent (based on openai-agents)
OpenAI Agents are AI agents driven by OpenAI technology that can carry out all kinds of automated tasks, such as data analysis, content generation, information retrieval, and multimodal interaction. Built on powerful language models (such as GPT-4 and GPT-4o) via the OpenAI API, they combine natural-language understanding, reasoning, and task execution, and can be applied in business, research, entertainment, and many other fields.
```python
import asyncio

import streamlit as st
from agents import Agent, Runner, trace, set_default_openai_key
from agents.tool import function_tool
from firecrawl import FirecrawlApp

set_default_openai_key(openai_api_key)                # key supplied by the user in the app
firecrawl_app = FirecrawlApp(api_key=firecrawl_api_key)

# Research topic input
research_topic = st.text_input("Enter your research topic:", placeholder="e.g., Latest developments in AI")

# Wrap Firecrawl's deep research endpoint as a tool the agent can call
@function_tool
def deep_research(query: str, max_depth: int, time_limit: int, max_urls: int) -> str:
    """Run Firecrawl deep research and return the final analysis text."""
    params = {"maxDepth": max_depth, "timeLimit": time_limit, "maxUrls": max_urls}
    results = firecrawl_app.deep_research(
        query=query,
        params=params,
        on_activity=lambda activity: st.write(activity.get("message", "")),
    )
    return results["data"]["finalAnalysis"]  # synthesized analysis, per Firecrawl's response shape
# Research agent
research_agent = Agent(
    name="research_agent",
    instructions="""You are a research assistant that can perform deep web research on any topic.
    When given a research topic or question:
    1. Use the deep_research tool to gather comprehensive information
       - Always use these parameters:
         * max_depth: 3 (for moderate depth)
         * time_limit: 180 (3 minutes)
         * max_urls: 10 (sufficient sources)
    2. The tool will search the web, analyze multiple sources, and provide a synthesis
    3. Review the research results and organize them into a well-structured report
    4. Include proper citations for all sources
    5. Highlight key findings and insights
    """,
    tools=[deep_research]
)
# Elaboration agent that expands and enriches the report
elaboration_agent = Agent(
    name="elaboration_agent",
    instructions="""You are an expert content enhancer specializing in research elaboration.
    When given a research report:
    1. Analyze the structure and content of the report
    2. Enhance the report by:
       - Adding more detailed explanations of complex concepts
       - Including relevant examples, case studies, and real-world applications
       - Expanding on key points with additional context and nuance
       - Adding visual elements descriptions (charts, diagrams, infographics)
       - Incorporating latest trends and future predictions
       - Suggesting practical implications for different stakeholders
    3. Maintain academic rigor and factual accuracy
    4. Preserve the original structure while making it more comprehensive
    5. Ensure all additions are relevant and valuable to the topic
    """
)
# Run the two-agent workflow
async def run_research_process(topic: str):
    with trace("Deep research workflow"):  # group both runs under one trace
        research_result = await Runner.run(research_agent, topic)
        initial_report = research_result.final_output

        elaboration_input = f"""
        RESEARCH TOPIC: {topic}

        INITIAL RESEARCH REPORT:
        {initial_report}

        Please enhance this research report with additional information, examples, case studies,
        and deeper insights while maintaining its academic rigor and factual accuracy.
        """
        elaboration_result = await Runner.run(elaboration_agent, elaboration_input)
        enhanced_report = elaboration_result.final_output

    return enhanced_report

# Run the research process
enhanced_report = asyncio.run(run_research_process(research_topic))
```
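Finally, the Streamlit app would render the finished report and offer it for download; a minimal sketch (labels and file name are assumptions):

```python
# Hypothetical presentation step for the finished report
st.markdown(enhanced_report)
st.download_button(
    label="Download report",
    data=enhanced_report,
    file_name="research_report.md",
    mime="text/markdown",
)
```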