
Oracle Cloud Infrastructure Generative AI

Oracle Cloud Infrastructure (OCI) Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) covering a wide range of use cases, available through a single API. Using the OCI Generative AI service, you can access ready-to-use pretrained models, or create and host your own fine-tuned custom models based on your own data on dedicated AI clusters. Detailed documentation of the service and API is available here and here.

This notebook explains how to use OCI's Generative AI models with LangChain.

Prerequisites

We will need to install the oci sdk (along with the langchain-community package, which provides this integration):

!pip install -U oci langchain-community

OCI Generative AI API endpoint

https://inference.generativeai.us-chicago-1.oci.oraclecloud.com

Authentication

The authentication methods supported for this LangChain integration are:

  1. API Key
  2. Session token
  3. Instance principal
  4. Resource principal

These follow the standard SDK authentication methods detailed here.
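With the default API key method, the SDK reads a profile from an OCI config file (typically ~/.oci/config). A sketch of its shape follows — every value below is a placeholder, not a working credential:

```
[DEFAULT]
user=ocid1.user.oc1..<unique_id>
fingerprint=<api_key_fingerprint>
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..<unique_id>
region=us-chicago-1
```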

Usage

from langchain_community.embeddings import OCIGenAIEmbeddings

# use default authN method API-key
embeddings = OCIGenAIEmbeddings(
    model_id="MY_EMBEDDING_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

query = "This is a query in English."
response = embeddings.embed_query(query)
print(response)

documents = ["This is a sample document", "and here is another one"]
response = embeddings.embed_documents(documents)
print(response)
API Reference: OCIGenAIEmbeddings
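embed_query returns one vector and embed_documents returns one vector per input string; a common next step is ranking documents against a query by cosine similarity. A minimal self-contained sketch — the toy 3-dimensional vectors below stand in for real embedding output, and no OCI call is made:

```python
import math


def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy vectors standing in for embeddings.embed_query / embed_documents output
query_vec = [0.1, 0.9, 0.2]
doc_vecs = [
    [0.1, 0.8, 0.3],  # close to the query
    [0.9, 0.1, 0.0],  # far from the query
]

# Rank document indices by similarity to the query, best first
ranked = sorted(
    range(len(doc_vecs)),
    key=lambda i: cosine_similarity(query_vec, doc_vecs[i]),
    reverse=True,
)
print(ranked)  # index of the more similar document comes first
```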

# Use Session Token to authN
embeddings = OCIGenAIEmbeddings(
    model_id="MY_EMBEDDING_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="SECURITY_TOKEN",
    auth_profile="MY_PROFILE",  # replace with your profile name
    auth_file_location="MY_CONFIG_FILE_LOCATION",  # replace with the file location where the profile's configs are present
)

query = "This is a sample query"
response = embeddings.embed_query(query)
print(response)

documents = ["This is a sample document", "and here is another one"]
response = embeddings.embed_documents(documents)
print(response)
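When embedding large corpora, the service accepts only a limited number of texts per request, so inputs are sent in batches (the integration exposes a batch_size parameter for this; the exact per-request limit is service-dependent). A minimal chunking helper, independent of any API call, illustrates the pattern:

```python
def batched(items, batch_size):
    # Yield successive batch_size-sized chunks of items
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]


docs = [f"document {i}" for i in range(10)]
batches = list(batched(docs, 4))
print([len(b) for b in batches])  # [4, 4, 2]
```

Each batch could then be passed to embed_documents in turn and the resulting vectors concatenated.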
