
Class: OpenAI

OpenAI LLM implementation

Implements

- LLM

Constructors

constructor

new OpenAI(init?)

Parameters

| Name | Type |
| :--- | :--- |
| `init?` | `Partial<OpenAI> & { azure?: AzureOpenAIConfig }` |

Defined in

llm/LLM.ts:102
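Since every field of `init` is a `Partial<OpenAI>`, any subset of the properties documented below can be passed. The sketch below models that shape with a local stand-in interface (`OpenAIInit` is hypothetical, not an exported name); it only illustrates which overrides the constructor accepts.

```typescript
// Hypothetical stand-in for the `Partial<OpenAI>` portion of `init`;
// field names mirror the properties documented on this page.
interface OpenAIInit {
  model?: "gpt-3.5-turbo" | "gpt-3.5-turbo-16k" | "gpt-4" | "gpt-4-32k";
  temperature?: number;
  topP?: number;
  maxTokens?: number;
  apiKey?: string;
  maxRetries?: number;
  timeout?: number;
}

// Every field is optional, so a partial override is enough:
const init: OpenAIInit = {
  model: "gpt-4",
  temperature: 0,
  maxRetries: 5,
};

console.log(init.model); // → "gpt-4"
```

Unlisted fields (e.g. `topP`) fall back to the class defaults.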

Properties

additionalChatOptions

Optional additionalChatOptions: Omit<Partial<CompletionCreateParams>, "model" | "temperature" | "max_tokens" | "messages" | "top_p" | "streaming">

Defined in

llm/LLM.ts:85


additionalSessionOptions

Optional additionalSessionOptions: Omit<Partial<ClientOptions>, "apiKey" | "timeout" | "maxRetries">

Defined in

llm/LLM.ts:95


apiKey

Optional apiKey: string = undefined

Defined in

llm/LLM.ts:91


callbackManager

Optional callbackManager: CallbackManager

Defined in

llm/LLM.ts:100


maxRetries

maxRetries: number

Defined in

llm/LLM.ts:92


maxTokens

Optional maxTokens: number

Defined in

llm/LLM.ts:84


model

model: "gpt-3.5-turbo" | "gpt-3.5-turbo-16k" | "gpt-4" | "gpt-4-32k"

Defined in

llm/LLM.ts:81


session

session: OpenAISession

Defined in

llm/LLM.ts:94


temperature

temperature: number

Defined in

llm/LLM.ts:82


timeout

Optional timeout: number

Defined in

llm/LLM.ts:93


topP

topP: number

Defined in

llm/LLM.ts:83

Methods

chat

chat(messages, parentEvent?): Promise<ChatResponse>

Get a chat response from the LLM

Parameters

| Name | Type |
| :--- | :--- |
| `messages` | `ChatMessage[]` |
| `parentEvent?` | `Event` |

Returns

Promise<ChatResponse>

Implementation of

LLM.chat

Defined in

llm/LLM.ts:173
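The call shape can be sketched as follows. `ChatMessage` and `ChatResponse` are modeled here as minimal local stand-ins, and the `chat` body is a mock echo, not the real network call, so the block stays self-contained.

```typescript
// Minimal stand-ins for the interfaces referenced on this page.
type MessageType = "user" | "assistant" | "system" | "function";

interface ChatMessage {
  role: MessageType;
  content: string;
}

interface ChatResponse {
  message: ChatMessage;
}

// Mock stand-in for OpenAI.chat: echoes the last message instead of
// calling the OpenAI API, to illustrate the input/output shapes only.
async function chat(messages: ChatMessage[]): Promise<ChatResponse> {
  const last = messages[messages.length - 1];
  return { message: { role: "assistant", content: `echo: ${last.content}` } };
}

chat([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello" },
]).then((res) => console.log(res.message.content)); // → "echo: Hello"
```

The resolved `ChatResponse` carries the assistant's reply as a `ChatMessage`, so multi-turn conversations can append it to `messages` for the next call.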


complete

complete(prompt, parentEvent?): Promise<ChatResponse>

Get a prompt completion from the LLM

Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `prompt` | `string` | the prompt to complete |
| `parentEvent?` | `Event` | - |

Returns

Promise<ChatResponse>

Implementation of

LLM.complete

Defined in

llm/LLM.ts:214
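Note that `complete` returns a `Promise<ChatResponse>`, the same type as `chat`. A plausible reading (an assumption, not the confirmed implementation) is that it wraps the prompt as a single user message and delegates to the chat path:

```typescript
// Minimal local stand-ins; the chat body is a mock, not the real API call.
interface ChatMessage {
  role: "user" | "assistant" | "system" | "function";
  content: string;
}
interface ChatResponse {
  message: ChatMessage;
}

async function chat(messages: ChatMessage[]): Promise<ChatResponse> {
  return {
    message: { role: "assistant", content: `reply to: ${messages[0].content}` },
  };
}

// Assumed sketch: complete() reuses chat() with a one-message conversation.
async function complete(prompt: string): Promise<ChatResponse> {
  return chat([{ role: "user", content: prompt }]);
}

complete("Say hi").then((r) => console.log(r.message.content)); // → "reply to: Say hi"
```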


mapMessageType

mapMessageType(messageType): "function" | "user" | "assistant" | "system"

Parameters

| Name | Type |
| :--- | :--- |
| `messageType` | `MessageType` |

Returns

"function" | "user" | "assistant" | "system"

Defined in

llm/LLM.ts:156
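The mapping can be sketched as a switch over the four roles the OpenAI API models. `MessageType` here is a local stand-in that includes a hypothetical extra variant, and the fallback to `"user"` for unrecognized types is an assumption, not the confirmed behavior:

```typescript
// Stand-in for MessageType; "generic" is a hypothetical extra variant.
type MessageType = "user" | "assistant" | "system" | "function" | "generic";
type OpenAIRole = "function" | "user" | "assistant" | "system";

// Sketch of the role mapping mapMessageType performs.
function mapMessageType(messageType: MessageType): OpenAIRole {
  switch (messageType) {
    case "user":
    case "assistant":
    case "system":
    case "function":
      return messageType;
    default:
      // Assumed fallback for message types the OpenAI API does not model.
      return "user";
  }
}

console.log(mapMessageType("system")); // → "system"
console.log(mapMessageType("generic")); // → "user"
```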