AzureMLChatOnlineEndpoint
Azure Machine Learning is a platform used to build, train, and deploy machine learning models. Users can explore the types of models to deploy in the Model Catalog, which provides foundational and general purpose models from different providers.
In general, you need to deploy models in order to consume their predictions (inference). In Azure Machine Learning, Online Endpoints are used to deploy these models with real-time serving. They are based on the ideas of Endpoints and Deployments, which allow you to decouple the interface of your production workload from the implementation that serves it.
This notebook goes over how to use a chat model hosted on an Azure Machine Learning Endpoint.
from langchain_community.chat_models.azureml_endpoint import AzureMLChatOnlineEndpoint
API Reference: AzureMLChatOnlineEndpoint
Setup
You must deploy a model on Azure ML or to Azure AI studio and obtain the following parameters:

endpoint_url: The REST endpoint URL provided by the endpoint.
endpoint_api_type: Use endpoint_api_type='dedicated' when deploying the model to a dedicated endpoint (hosted managed infrastructure). Use endpoint_api_type='serverless' when deploying the model with the pay-as-you-go offering (model as a service).
endpoint_api_key: The API key provided by the endpoint.
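If you prefer not to hard-code the key in your notebook, the same values can be read from the environment. A minimal sketch, assuming you export the (hypothetical) variables AZUREML_ENDPOINT_URL and AZUREML_ENDPOINT_API_KEY yourself; the values are still passed explicitly to the constructor:

import os

from langchain_community.chat_models.azureml_endpoint import (
    AzureMLChatOnlineEndpoint,
    AzureMLEndpointApiType,
    CustomOpenAIChatContentFormatter,
)

# The environment variable names here are just a convention for this sketch;
# the constructor receives the values explicitly.
chat = AzureMLChatOnlineEndpoint(
    endpoint_url=os.environ["AZUREML_ENDPOINT_URL"],
    endpoint_api_type=AzureMLEndpointApiType.dedicated,
    endpoint_api_key=os.environ["AZUREML_ENDPOINT_API_KEY"],
    content_formatter=CustomOpenAIChatContentFormatter(),
)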
Content Formatter
The content_formatter parameter is a handler class used to transform the request and response of an AzureML endpoint to match the required schema. Since there is a wide range of models in the model catalog, each of which may process data differently, a ContentFormatterBase class is provided so that users can transform data to their liking. The following content formatter is provided:

CustomOpenAIChatContentFormatter: Formats request and response data for models like LLaMa2-chat that follow the OpenAI API spec for requests and responses.

Note: langchain.chat_models.azureml_endpoint.LlamaChatContentFormatter is being deprecated and will be replaced with langchain.chat_models.azureml_endpoint.CustomOpenAIChatContentFormatter.

You can implement a custom content formatter specific to your model by deriving from the langchain_community.llms.azureml_endpoint.ContentFormatterBase class, as sketched below.
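For illustration only, here is a minimal sketch of a custom chat content formatter. It assumes the chat-side formatters implement format_messages_request_payload and format_response_payload, as CustomOpenAIChatContentFormatter does, and it invents a wire format ({"messages": [...], "parameters": {...}} in, {"output": "..."} out) purely for the example; check ContentFormatterBase and CustomOpenAIChatContentFormatter in your installed version for the exact method signatures.

import json
from typing import Dict, List

from langchain_community.llms.azureml_endpoint import (
    AzureMLEndpointApiType,
    ContentFormatterBase,
)
from langchain_core.messages import AIMessage, BaseMessage
from langchain_core.outputs import ChatGeneration


class MyChatContentFormatter(ContentFormatterBase):
    """Hypothetical formatter for an endpoint that accepts
    {"messages": [...], "parameters": {...}} and returns {"output": "..."}."""

    @property
    def supported_api_types(self) -> List[AzureMLEndpointApiType]:
        return [AzureMLEndpointApiType.dedicated]

    def format_messages_request_payload(
        self,
        messages: List[BaseMessage],
        model_kwargs: Dict,
        api_type: AzureMLEndpointApiType,
    ) -> bytes:
        # Map LangChain message objects onto the endpoint's (assumed) schema.
        chat_messages = [{"role": m.type, "content": m.content} for m in messages]
        return json.dumps(
            {"messages": chat_messages, "parameters": model_kwargs}
        ).encode("utf-8")

    def format_response_payload(
        self,
        output: bytes,
        api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.dedicated,
    ) -> ChatGeneration:
        # Parse the endpoint's (assumed) response body back into a ChatGeneration.
        text = json.loads(output)["output"]
        return ChatGeneration(message=AIMessage(content=text))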
Examples
The section below contains examples of how to use this class.
Example: Chat completions with real-time endpoints
from langchain_community.chat_models.azureml_endpoint import (
    AzureMLEndpointApiType,
    CustomOpenAIChatContentFormatter,
)
from langchain_core.messages import HumanMessage
chat = AzureMLChatOnlineEndpoint(
    endpoint_url="https://<your-endpoint>.<your_region>.inference.ml.azure.com/score",
    endpoint_api_type=AzureMLEndpointApiType.dedicated,
    endpoint_api_key="my-api-key",
    content_formatter=CustomOpenAIChatContentFormatter(),
)
response = chat.invoke(
    [HumanMessage(content="Will the Collatz conjecture ever be solved?")]
)
response
AIMessage(content=' The Collatz Conjecture is one of the most famous unsolved problems in mathematics, and it has been the subject of much study and research for many years. While it is impossible to predict with certainty whether the conjecture will ever be solved, there are several reasons why it is considered a challenging and important problem:\n\n1. Simple yet elusive: The Collatz Conjecture is a deceptively simple statement that has proven to be extraordinarily difficult to prove or disprove. Despite its simplicity, the conjecture has eluded some of the brightest minds in mathematics, and it remains one of the most famous open problems in the field.\n2. Wide-ranging implications: The Collatz Conjecture has far-reaching implications for many areas of mathematics, including number theory, algebra, and analysis. A solution to the conjecture could have significant impacts on these fields and potentially lead to new insights and discoveries.\n3. Computational evidence: While the conjecture remains unproven, extensive computational evidence supports its validity. In fact, no counterexample to the conjecture has been found for any starting value up to 2^64 (a number', additional_kwargs={}, example=False)
Example: Chat completions with pay-as-you-go deployments (model as a service)
chat = AzureMLChatOnlineEndpoint(
    endpoint_url="https://<your-endpoint>.<your_region>.inference.ml.azure.com/v1/chat/completions",
    endpoint_api_type=AzureMLEndpointApiType.serverless,
    endpoint_api_key="my-api-key",
    content_formatter=CustomOpenAIChatContentFormatter(),
)
response = chat.invoke(
    [HumanMessage(content="Will the Collatz conjecture ever be solved?")]
)
response
If you need to pass additional parameters to the model, use the model_kwargs argument:
chat = AzureMLChatOnlineEndpoint(
    endpoint_url="https://<your-endpoint>.<your_region>.inference.ml.azure.com/v1/chat/completions",
    endpoint_api_type=AzureMLEndpointApiType.serverless,
    endpoint_api_key="my-api-key",
    content_formatter=CustomOpenAIChatContentFormatter(),
    model_kwargs={"temperature": 0.8},
)
Parameters can also be passed during invocation:
response = chat.invoke(
    [HumanMessage(content="Will the Collatz conjecture ever be solved?")],
    max_tokens=512,
)
response