# Class: OpenAI

OpenAI LLM implementation.
## Constructors

### constructor

• **new OpenAI**(`init?`)

#### Parameters

| Name | Type |
| :--- | :--- |
| `init?` | `Partial<OpenAI> & { azure?: AzureOpenAIConfig }` |
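As a hedged sketch of how the `init` object might be put together: the shapes below are local stand-ins modeled only on the fields documented on this page (`OpenAI` and `AzureOpenAIConfig` themselves come from the library, so their real definitions may carry more fields).

```typescript
// Local stand-ins for the documented option shapes (assumptions, not the
// library's actual type definitions).
interface AzureOpenAIConfig {
  endpoint?: string;
  apiVersion?: string;
}

interface OpenAIInit {
  model?: "gpt-3.5-turbo" | "gpt-3.5-turbo-16k" | "gpt-4" | "gpt-4-32k";
  temperature?: number;
  topP?: number;
  maxTokens?: number;
  maxRetries?: number;
  timeout?: number;
  apiKey?: string;
  azure?: AzureOpenAIConfig;
}

// Every field is optional (Partial<OpenAI>), so you override only what
// you need; `new OpenAI(init)` would pick up these values.
const init: OpenAIInit = {
  model: "gpt-4",
  temperature: 0.1,
  maxRetries: 3,
};

console.log(init.model); // → "gpt-4"
```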
## Properties

### additionalChatOptions

• `Optional` **additionalChatOptions**: `Omit<Partial<CompletionCreateParams>, "model" | "temperature" | "max_tokens" | "messages" | "top_p" | "streaming">`
### additionalSessionOptions

• `Optional` **additionalSessionOptions**: `Omit<Partial<ClientOptions>, "apiKey" | "timeout" | "maxRetries">`
### apiKey

• `Optional` **apiKey**: `string` = `undefined`

### callbackManager

• `Optional` **callbackManager**: `CallbackManager`
### maxRetries

• **maxRetries**: `number`

### maxTokens

• `Optional` **maxTokens**: `number`

### model

• **model**: `"gpt-3.5-turbo" | "gpt-3.5-turbo-16k" | "gpt-4" | "gpt-4-32k"`
### session

• **session**: `OpenAISession`

### temperature

• **temperature**: `number`

### timeout

• `Optional` **timeout**: `number`

### topP

• **topP**: `number`
## Methods

### chat

▸ **chat**(`messages`, `parentEvent?`): `Promise<ChatResponse>`

Get a chat response from the LLM.

#### Parameters

| Name | Type |
| :--- | :--- |
| `messages` | `ChatMessage[]` |
| `parentEvent?` | `Event` |

#### Returns

`Promise<ChatResponse>`
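A hedged usage sketch for `chat()`: `ChatMessage` below is a local stand-in modeled on the role strings documented on this page, and the actual call is left in a comment to avoid a live API request.

```typescript
// Local stand-in for the library's ChatMessage type (an assumption,
// modeled on the role strings this page documents).
type ChatMessage = {
  role: "system" | "user" | "assistant" | "function";
  content: string;
};

const messages: ChatMessage[] = [
  { role: "system", content: "You are a concise assistant." },
  { role: "user", content: "Summarize what an LLM is in one sentence." },
];

// With a configured instance (assuming ChatResponse exposes a message field):
//   const llm = new OpenAI({ model: "gpt-3.5-turbo" });
//   const response = await llm.chat(messages); // Promise<ChatResponse>
//   console.log(response.message.content);
console.log(messages[0].role); // → "system"
```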
### complete

▸ **complete**(`prompt`, `parentEvent?`): `Promise<ChatResponse>`

Get a prompt completion from the LLM.

#### Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `prompt` | `string` | the prompt to complete |
| `parentEvent?` | `Event` | - |

#### Returns

`Promise<ChatResponse>`
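Note that `complete()` takes a plain string yet resolves to the same `Promise<ChatResponse>` as `chat()`. The shape below is an assumption inferred from that shared return type, not the library's actual definition.

```typescript
// Assumed shape of the value both chat() and complete() resolve to.
type ChatResponse = {
  message: {
    role: "system" | "user" | "assistant" | "function";
    content: string;
  };
};

// With a real instance: const res = await llm.complete("Say hello");
// Here we only model the resolved value, not the network call.
const res: ChatResponse = {
  message: { role: "assistant", content: "Hello!" },
};

console.log(res.message.content); // → "Hello!"
```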
### mapMessageType

▸ **mapMessageType**(`messageType`): `"function" | "user" | "assistant" | "system"`

#### Parameters

| Name | Type |
| :--- | :--- |
| `messageType` | `MessageType` |

#### Returns

`"function" | "user" | "assistant" | "system"`