
ChatGoodfire

This will help you get started with Goodfire chat models. For detailed documentation of all ChatGoodfire features and configurations, head to the PyPI project page, or go directly to the Goodfire SDK documentation. All Goodfire-specific functionality (e.g. SAE features, variants, etc.) is available via the main goodfire package. This integration is a wrapper around the Goodfire SDK.

Overview

Integration details

| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| :--- | :--- | :---: | :---: | :---: | :---: | :---: |
| ChatGoodfire | langchain-goodfire | | | | PyPI - Downloads | PyPI - Version |

Model features

| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |

Setup

To access Goodfire models, you'll need to create a Goodfire account, get an API key, and install the langchain-goodfire integration package.

Credentials

Head to Goodfire Settings to sign up for Goodfire and generate an API key. Once you've done this, set the GOODFIRE_API_KEY environment variable.

import getpass
import os

if not os.getenv("GOODFIRE_API_KEY"):
    os.environ["GOODFIRE_API_KEY"] = getpass.getpass("Enter your Goodfire API key: ")

To get automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:

# os.environ["LANGSMITH_TRACING"] = "true"
# os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain Goodfire integration lives in the langchain-goodfire package:

%pip install -qU langchain-goodfire
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate our model object and generate chat completions:

import goodfire
from langchain_goodfire import ChatGoodfire

base_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

llm = ChatGoodfire(
    model=base_variant,
    temperature=0,
    max_completion_tokens=1000,
    seed=42,
)
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = await llm.ainvoke(messages)
ai_msg
AIMessage(content="J'adore la programmation.", additional_kwargs={}, response_metadata={}, id='run-8d43cf35-bce8-4827-8935-c64f8fb78cd0-0', usage_metadata={'input_tokens': 51, 'output_tokens': 39, 'total_tokens': 90})
print(ai_msg.content)
J'adore la programmation.
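Token counts come back on the message's usage_metadata. As a quick sanity check, the reported input and output tokens should account for the total (values below are copied from the example output above into a plain dict so the snippet runs standalone):

```python
# usage_metadata values copied from the AIMessage output above
usage = {"input_tokens": 51, "output_tokens": 39, "total_tokens": 90}

# input + output should account for the reported total
assert usage["input_tokens"] + usage["output_tokens"] == usage["total_tokens"]
print(usage["total_tokens"])  # → 90
```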

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
await chain.ainvoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: ChatPromptTemplate
AIMessage(content='Ich liebe das Programmieren. How can I help you with programming today?', additional_kwargs={}, response_metadata={}, id='run-03d1a585-8234-46f1-a8df-bf9143fe3309-0', usage_metadata={'input_tokens': 46, 'output_tokens': 46, 'total_tokens': 92})
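The template fills in the {input_language}, {output_language}, and {input} placeholders before the messages reach the model. A minimal plain-Python sketch of that substitution, using str.format as a stand-in for what ChatPromptTemplate does:

```python
# Stand-in for ChatPromptTemplate's variable substitution
system_tmpl = "You are a helpful assistant that translates {input_language} to {output_language}."
human_tmpl = "{input}"

inputs = {
    "input_language": "English",
    "output_language": "German",
    "input": "I love programming.",
}

# str.format ignores unused keyword arguments, so both templates
# can be rendered from the same inputs dict
messages = [
    ("system", system_tmpl.format(**inputs)),
    ("human", human_tmpl.format(**inputs)),
]
print(messages[0][1])
# → You are a helpful assistant that translates English to German.
```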

Goodfire-specific features

To use Goodfire-specific features such as SAE features and variants, you can use the goodfire package directly.

client = goodfire.Client(api_key=os.environ["GOODFIRE_API_KEY"])

pirate_features = client.features.search(
    "assistant should roleplay as a pirate", base_variant
)
pirate_features
FeatureGroup([
   0: "The assistant should adopt the persona of a pirate",
   1: "The assistant should roleplay as a pirate",
   2: "The assistant should engage with pirate-themed content or roleplay as a pirate",
   3: "The assistant should roleplay as a character",
   4: "The assistant should roleplay as a specific character",
   5: "The assistant should roleplay as a game character or NPC",
   6: "The assistant should roleplay as a human character",
   7: "Requests for the assistant to roleplay or pretend to be something else",
   8: "Requests for the assistant to roleplay or pretend to be something",
   9: "The assistant is being assigned a role or persona to roleplay"
])
pirate_variant = goodfire.Variant("meta-llama/Llama-3.3-70B-Instruct")

pirate_variant.set(pirate_features[0], 0.4)
pirate_variant.set(pirate_features[1], 0.3)

await llm.ainvoke("Tell me a joke", model=pirate_variant)
AIMessage(content='Why did the scarecrow win an award? Because he was outstanding in his field! Arrr! Hope that made ye laugh, matey!', additional_kwargs={}, response_metadata={}, id='run-7d8bd30f-7f80-41cb-bdb6-25c29c22a7ce-0', usage_metadata={'input_tokens': 35, 'output_tokens': 60, 'total_tokens': 95})
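A variant records a weight for each feature you set, and later model calls made with that variant are steered accordingly. A toy sketch of that bookkeeping, under the assumption that a variant is essentially a base model plus a feature-to-weight mapping (TinyVariant is hypothetical; the real goodfire.Variant does much more):

```python
class TinyVariant:
    """Toy stand-in for the feature-weight bookkeeping in goodfire.Variant."""

    def __init__(self, base_model: str):
        self.base_model = base_model
        self.edits: dict[str, float] = {}

    def set(self, feature: str, weight: float) -> None:
        # Positive weights strengthen a feature; negative weights suppress it
        self.edits[feature] = weight


variant = TinyVariant("meta-llama/Llama-3.3-70B-Instruct")
variant.set("pirate persona", 0.4)
variant.set("pirate roleplay", 0.3)
print(variant.edits)
```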

API reference

For detailed documentation of all ChatGoodfire features and configurations, head to the API reference.

