LangGraph Studio supports connecting to two types of graphs:
  • Graphs deployed on LangGraph Platform.
  • Graphs running locally via the LangGraph Server.
LangGraph Studio is accessed from the LangSmith UI, within the LangGraph Platform Deployments tab.
For applications that are deployed on LangGraph Platform, you can access Studio as part of that deployment. To do so, navigate to the deployment in LangGraph Platform within the LangSmith UI and click the “LangGraph Studio” button.
This will load the Studio UI connected to your live deployment, allowing you to create, read, and update the threads, assistants, and memory in that deployment.
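The same resources Studio exposes can also be managed programmatically. A minimal sketch using the LangGraph Python SDK, assuming a hypothetical deployment URL and API key (substitute your own values):

# pip install langgraph-sdk
import asyncio
from langgraph_sdk import get_client

async def main():
    # Hypothetical deployment URL and API key -- replace with your own.
    client = get_client(
        url="https://my-deployment.us.langgraph.app",
        api_key="lsv2_...",
    )
    # The same objects Studio lets you browse: assistants and threads.
    assistants = await client.assistants.search()
    thread = await client.threads.create()
    print(len(assistants), thread["thread_id"])

asyncio.run(main())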

Local development server

To test your locally running application using LangGraph Studio, ensure your application is set up following this guide.
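In particular, the project needs a langgraph.json configuration file that tells the server where your graphs live. A minimal sketch, assuming a compiled graph named graph exported from ./src/agent/graph.py (the path and graph name are placeholders):

{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/agent/graph.py:graph"
  },
  "env": ".env"
}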
LangSmith tracing: For local development, if you do not wish to have data traced to LangSmith, set LANGSMITH_TRACING=false in your application's .env file. With tracing disabled, no data will leave your local server.
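For example, a minimal .env for fully local, untraced development could contain just the line below (the commented provider key is a hypothetical placeholder, only needed if your graph calls that provider):

LANGSMITH_TRACING=false
# OPENAI_API_KEY=...   # any model-provider keys your graph needs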
Next, install the LangGraph CLI:
pip install -U "langgraph-cli[inmem]"
and run:
langgraph dev
Browser compatibility: Safari blocks localhost connections to Studio. To work around this, run the above command with --tunnel to access Studio via a secure tunnel.
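For example:

langgraph dev --tunnel

Studio then connects through the tunnel URL instead of localhost.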
This will start the LangGraph Server locally, running in-memory. The server will run in watch mode, listening for and automatically restarting on code changes. Read this reference to learn about all the options for starting the API server.
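For example, a couple of commonly useful variants (run langgraph dev --help for the full list of options):

langgraph dev --port 8000   # listen on a different port
langgraph dev --no-browser  # don't open Studio automatically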
If successful, you will see the following logs:
Ready!
Once running, you will automatically be directed to LangGraph Studio.
For an already running server, access Studio by either:
  1. Directly navigate to the following URL: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024.
  2. Within LangSmith, navigate to the LangGraph Platform Deployments tab, click the “LangGraph Studio” button, enter http://127.0.0.1:2024 and click “Connect”.
If your server is running on a different host or port, update the baseUrl to match.
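For example, if the server is listening on port 8000 instead, the Studio URL becomes:

https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:8000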

(Optional) Attach a debugger

For step-by-step debugging with breakpoints and variable inspection:
# Install debugpy package
pip install debugpy

# Start server with debugging enabled
langgraph dev --debug-port 5678
Then attach your preferred debugger:
Add this configuration to launch.json:
{
    "name": "Attach to LangGraph",
    "type": "debugpy",
    "request": "attach",
    "connect": {
      "host": "0.0.0.0",
      "port": 5678
    }
}
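Once attached, set a breakpoint inside one of your graph's node functions and start a run from Studio; execution pauses there with full variable inspection. A minimal sketch of such a node (the name and state shape are hypothetical):

# Hypothetical node in your graph module, e.g. ./src/agent/graph.py
def my_node(state: dict) -> dict:
    # Set a breakpoint on the next line; runs triggered from Studio
    # will pause here while the debugger is attached.
    summary = f"received {len(state.get('messages', []))} messages"
    return {"summary": summary}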

Troubleshooting

For issues getting started, please see this troubleshooting guide.

Next steps

See the following guides for more information on how to use Studio: