
smolagents documentation

smolagents

You are viewing the main version, which requires installation from source. If you'd like a regular pip install, check out the latest stable version (v1.19.0).

What is smolagents?

smolagents is an open-source Python library designed to make it extremely easy to build and run agents using just a few lines of code.

Key features of smolagents include:

✨ Simplicity: The logic for agents fits in ~1,000 lines of code. We kept abstractions to their minimal shape above raw code!

🧑‍💻 First-class support for Code Agents: CodeAgent writes its actions in code (as opposed to “agents being used to write code”) to invoke tools or perform computations, enabling natural composability (function nesting, loops, conditionals). To make this secure, we support executing in a sandboxed environment via E2B or via Docker.
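To see why code-as-actions composes naturally, here is a minimal sketch. This is not the smolagents API: `web_search` is a stub standing in for a real tool, and the function below imitates the kind of action code a code agent might emit.

```python
# Sketch of agent-style action code: tool calls mix freely with ordinary
# Python control flow. `web_search` is a stub standing in for a real tool.
def web_search(query):
    return [f"top hit for {query!r}"]

def first_hits(queries):
    hits = []
    for q in queries:                # loop over several tool calls
        results = web_search(q)      # tool invocation, result is plain data
        if results:                  # conditional on a tool's output
            hits.append(results[0])  # nesting: output of one step feeds the next
    return hits

print(first_hits(["weather in Paris", "weather in Lyon"]))
```

A JSON tool-calling agent would need one round trip per call; here the loop, the conditional, and the aggregation happen in a single generated snippet.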

📡 Common Tool-Calling Agent Support: In addition to CodeAgents, ToolCallingAgent supports usual JSON/text-based tool-calling for scenarios where that paradigm is preferred.

🤗 Hub integrations: Seamlessly share and load agents and tools to/from the Hub as Gradio Spaces.

🌐 Model-agnostic: Easily integrate any large language model (LLM), whether it’s hosted on the Hub via Inference providers, accessed via APIs such as OpenAI, Anthropic, or many others via LiteLLM integration, or run locally using Transformers or Ollama. Powering an agent with your preferred LLM is straightforward and flexible.

👁️ Modality-agnostic: Beyond text, agents can handle vision, video, and audio inputs, broadening the range of possible applications. Check out this tutorial for vision.

🛠️ Tool-agnostic: You can use tools from any MCP server or from LangChain; you can even use a Hub Space as a tool.

💻 CLI Tools: Comes with command-line utilities (smolagent, webagent) for quickly running agents without writing boilerplate code.

Quickstart

Get started with smolagents in just a few minutes! This guide will show you how to create and run your first agent.

Installation

Install smolagents with pip:

pip install smolagents[toolkit]  # Includes default tools like web search

Create Your First Agent

Here’s a minimal example to create and run an agent:

from smolagents import CodeAgent, InferenceClientModel

# Initialize a model (using Hugging Face Inference API)
model = InferenceClientModel()  # Uses a default model

# Create an agent with no tools
agent = CodeAgent(tools=[], model=model)

# Run the agent with a task
result = agent.run("Calculate the sum of numbers from 1 to 10")
print(result)

That’s it! Your agent will use Python code to solve the task and return the result.
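For a task like this, the agent's generated action code might reduce to a one-liner. The snippet below is illustrative, not the agent's literal output:

```python
# Illustrative: the kind of Python a code agent might write for this task.
total = sum(range(1, 11))  # 1 + 2 + ... + 10
print(total)  # 55
```

The agent executes this snippet, observes the printed value, and returns it as the final answer.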

Adding Tools

Let’s make our agent more capable by adding some tools:

from smolagents import CodeAgent, InferenceClientModel, DuckDuckGoSearchTool

model = InferenceClientModel()
agent = CodeAgent(
    tools=[DuckDuckGoSearchTool()],
    model=model,
)

# Now the agent can search the web!
result = agent.run("What is the current weather in Paris?")
print(result)

Using Different Models

You can use various models with your agent:

# Using a specific model from Hugging Face
model = InferenceClientModel(model_id="meta-llama/Llama-2-70b-chat-hf")

# Using OpenAI/Anthropic (requires smolagents[litellm])
from smolagents import LiteLLMModel
model = LiteLLMModel(model_id="gpt-4")

# Using local models (requires smolagents[transformers])
from smolagents import TransformersModel
model = TransformersModel(model_id="meta-llama/Llama-2-7b-chat-hf")
