
PredictionGuardEmbeddings

Prediction Guard is a secure, scalable GenAI platform that safeguards sensitive data, prevents common AI malfunctions, and runs on affordable hardware.

Overview

Integration details

This integration shows how to use the Prediction Guard embeddings integration with LangChain. The integration supports both text and images, used either separately or in matched pairs.

Setup

To access Prediction Guard models, contact us here to get a Prediction Guard API key and get started.

Credentials

Once you have a key, you can set it with:

import os

os.environ["PREDICTIONGUARD_API_KEY"] = "<Prediction Guard API Key>"
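
If you'd rather not hard-code the key, here is a minimal sketch that uses Python's standard getpass module to prompt for it at runtime (an optional alternative, not part of the integration itself):

import getpass
import os

# Prompt for the key only if it isn't already set in the environment
if "PREDICTIONGUARD_API_KEY" not in os.environ:
    os.environ["PREDICTIONGUARD_API_KEY"] = getpass.getpass("Enter your Prediction Guard API key: ")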

Installation

%pip install --upgrade --quiet langchain-predictionguard

Instantiation

First, install the Prediction Guard and LangChain packages. Then, set the required env vars and set up the package imports.

from langchain_predictionguard import PredictionGuardEmbeddings
embeddings = PredictionGuardEmbeddings(model="bridgetower-large-itm-mlm-itc")

Prediction Guard embeddings generation supports both text and images. This integration includes support for both, spread across various functions.

Indexing and Retrieval

# Create a vector store with a sample text
from langchain_core.vectorstores import InMemoryVectorStore

text = "LangChain is the framework for building context-aware reasoning applications."

vectorstore = InMemoryVectorStore.from_texts(
    [text],
    embedding=embeddings,
)

# Use the vectorstore as a retriever
retriever = vectorstore.as_retriever()

# Retrieve the most similar text
retrieved_documents = retriever.invoke("What is LangChain?")

# Show the retrieved document's content
retrieved_documents[0].page_content
API Reference: InMemoryVectorStore
'LangChain is the framework for building context-aware reasoning applications.'
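
If you don't need a retriever object, you can also query the vector store directly; a minimal sketch reusing the vectorstore created above (k=1 just limits the result to the single stored text):

# Query the vector store directly instead of going through a retriever
results = vectorstore.similarity_search("What is LangChain?", k=1)
print(results[0].page_content)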

Direct Usage

Under the hood, the vectorstore and retriever implementations call embeddings.embed_documents(...) and embeddings.embed_query(...) to create embeddings for the text(s) used in from_texts and in the retrieval invoke operations, respectively.

You can call these methods directly, as shown below.

Embed single texts

# Embedding a single string
text = "This is an embedding example."
single_vector = embeddings.embed_query(text)

single_vector[:5]
[0.01456777285784483,
-0.08131945133209229,
-0.013045587576925755,
-0.09488929063081741,
-0.003087474964559078]

Embed multiple texts

# Embedding multiple strings
docs = [
    "This is an embedding example.",
    "This is another embedding example.",
]

two_vectors = embeddings.embed_documents(docs)

for vector in two_vectors:
    print(vector[:5])
[0.01456777285784483, -0.08131945133209229, -0.013045587576925755, -0.09488929063081741, -0.003087474964559078]
[-0.0015021917643025517, -0.08883760124444962, -0.0025286630261689425, -0.1052245944738388, 0.014225339516997337]
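
As an illustration of how these vectors can be used, here is a short sketch that computes the cosine similarity between the two document embeddings with numpy (numpy is an extra dependency here; the exact score will vary):

import numpy as np

# Cosine similarity between the two document embeddings above
a, b = np.array(two_vectors[0]), np.array(two_vectors[1])
print(float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b))))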

Embed single images

# Embedding a single image. These functions accept image URLs, image files, data URIs, and base64 encoded strings.
image = [
    "https://farm4.staticflickr.com/3300/3497460990_11dfb95dd1_z.jpg",
]
single_vector = embeddings.embed_images(image)

print(single_vector[0][:5])
[0.0911610797047615, -0.034427884966135025, 0.007927080616354942, -0.03500846028327942, 0.022317267954349518]

Embed multiple images

# Embedding multiple images
images = [
    "https://fastly.picsum.photos/id/866/200/300.jpg?hmac=rcadCENKh4rD6MAp6V_ma-AyWv641M4iiOpe1RyFHeI",
    "https://farm4.staticflickr.com/3300/3497460990_11dfb95dd1_z.jpg",
]

two_vectors = embeddings.embed_images(images)

for vector in two_vectors:
    print(vector[:5])
[0.1593627631664276, -0.03636132553219795, -0.013229663483798504, -0.08789524435997009, 0.062290553003549576]
[0.0911610797047615, -0.034427884966135025, 0.007927080616354942, -0.03500846028327942, 0.022317267954349518]
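
Because the model name ends in -itc (image-text contrastive), the text and image embeddings are expected to share a vector space, so an image embedding can be scored against a text query embedding. A hedged sketch under that assumption, reusing images and two_vectors from the example above (the query string is arbitrary):

import numpy as np

# Score each image embedding against a text query embedding
query_vector = np.array(embeddings.embed_query("an outdoor scene"))
for url, vector in zip(images, two_vectors):
    vec = np.array(vector)
    score = float(query_vector @ vec / (np.linalg.norm(query_vector) * np.linalg.norm(vec)))
    print(f"{score:.4f}  {url}")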

Embed single text-image pairs

# Embedding a single text-image pair
inputs = [
    {
        "text": "This is an embedding example.",
        "image": "https://farm4.staticflickr.com/3300/3497460990_11dfb95dd1_z.jpg",
    },
]
single_vector = embeddings.embed_image_text(inputs)

print(single_vector[0][:5])
[0.0363212488591671, -0.10172265768051147, -0.014760786667466164, -0.046511903405189514, 0.03860781341791153]

Embed multiple text-image pairs

# Embedding multiple text-image pairs
inputs = [
    {
        "text": "This is an embedding example.",
        "image": "https://fastly.picsum.photos/id/866/200/300.jpg?hmac=rcadCENKh4rD6MAp6V_ma-AyWv641M4iiOpe1RyFHeI",
    },
    {
        "text": "This is another embedding example.",
        "image": "https://farm4.staticflickr.com/3300/3497460990_11dfb95dd1_z.jpg",
    },
]
two_vectors = embeddings.embed_image_text(inputs)

for vector in two_vectors:
    print(vector[:5])
[0.11867266893386841, -0.05898813530802727, -0.026179173961281776, -0.10747235268354416, 0.07684746384620667]
[0.026654226705431938, -0.10080841928720474, -0.012732953764498234, -0.04365091398358345, 0.036743905395269394]

API Reference

For detailed documentation of all PredictionGuardEmbeddings features and configurations, check out the API reference: https://python.langchain.ac.cn/api_reference/community/embeddings/langchain_community.embeddings.predictionguard.PredictionGuardEmbeddings.html

