
Helicone

This page covers how to use the Helicone ecosystem within LangChain.

What is Helicone?

Helicone is an open-source observability platform that proxies your OpenAI traffic and provides key insights into your spend, latency, and usage.

Quick start

Within your LangChain environment, you only need to add the following parameter:

export OPENAI_API_BASE="https://oai.hconeai.com/v1"
 

Now head over to helicone.ai to create your account, and add your OpenAI API key in the dashboard to view your logs.
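
With the base URL set, every LangChain OpenAI call is proxied through Helicone unchanged. As a minimal sketch (assuming the legacy langchain.llms.OpenAI wrapper used on this page and an OPENAI_API_KEY already configured in the environment), a request made this way shows up as a log in the dashboard:

import os

# Point the OpenAI client at the Helicone proxy before anything reads the setting;
# both the OpenAI SDK and LangChain pick up OPENAI_API_BASE from the environment.
os.environ["OPENAI_API_BASE"] = "https://oai.hconeai.com/v1"

from langchain.llms import OpenAI

llm = OpenAI(temperature=0.9)
print(llm("What is a helicone?"))  # this request now appears in the Helicone dashboard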

How to enable Helicone caching

from langchain.llms import OpenAI
import openai

# Route all OpenAI requests through the Helicone proxy
openai.api_base = "https://oai.hconeai.com/v1"

# The Helicone-Cache-Enabled header turns on response caching for this LLM
llm = OpenAI(temperature=0.9, headers={"Helicone-Cache-Enabled": "true"})
text = "What is a helicone?"
print(llm(text))
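
A quick way to confirm the cache is working (just a sketch reusing the llm and text defined above, not an official Helicone example) is to send the identical request twice and compare latencies; the second call should return noticeably faster because it is served from Helicone's cache:

import time

for label in ("first call", "second call"):
    start = time.time()
    llm(text)  # identical request, so the second one should be a cache hit
    print(f"{label}: {time.time() - start:.2f}s")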
 

Helicone caching docs

How to use Helicone custom properties

from langchain.llms import OpenAI
import openai

# Route all OpenAI requests through the Helicone proxy
openai.api_base = "https://oai.hconeai.com/v1"

# Helicone-Property-* headers attach custom metadata (session, conversation, app)
# to every request sent by this LLM
llm = OpenAI(temperature=0.9, headers={
        "Helicone-Property-Session": "24",
        "Helicone-Property-Conversation": "support_issue_2",
        "Helicone-Property-App": "mobile",
      })
text = "What is a helicone?"
print(llm(text))
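
These headers compose with the others on this page, so a single LLM can, for example, enable caching and tag its requests with properties at the same time (a sketch combining the two examples above):

# Caching and custom properties are independent headers and can be combined
llm = OpenAI(temperature=0.9, headers={
        "Helicone-Cache-Enabled": "true",
        "Helicone-Property-App": "mobile",
      })
print(llm("What is a helicone?"))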
 

Helicone property docs