
Deno build of the official TypeScript library for the OpenAI API.
class ChatCompletionStream<ParsedT>
extends AbstractChatCompletionRunner<ChatCompletionStreamEvents<ParsedT>, ParsedT>
implements AsyncIterable<ChatCompletionChunk>
Re-export
import { ChatCompletionStream } from "https://deno.land/x/openai@v4.56.0/resources/beta/chat/completions.ts";
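
For orientation, a minimal usage sketch: a ChatCompletionStream is normally obtained from client.beta.chat.completions.stream() rather than constructed directly. The OPENAI_API_KEY environment variable and the gpt-4o-mini model name are assumptions for illustration, not part of this page.

import OpenAI from "https://deno.land/x/openai@v4.56.0/mod.ts";

const client = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });

const stream = client.beta.chat.completions.stream({
  model: "gpt-4o-mini", // assumed model name
  messages: [{ role: "user", content: "Say hello" }],
});

// 'content' fires with each text delta plus the text accumulated so far.
stream.on("content", (delta, snapshot) => {
  console.log(delta, snapshot.length);
});

const completion = await stream.finalChatCompletion();
console.log(completion.choices[0].message.content);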

Constructors

new
ChatCompletionStream(params: ChatCompletionCreateParams | null)

Type Parameters

optional
ParsedT = null

Properties

readonly
currentChatCompletionSnapshot: ChatCompletionSnapshot | undefined
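
A short sketch of reading the snapshot as chunks arrive, under the same client and model assumptions as above:

import OpenAI from "https://deno.land/x/openai@v4.56.0/mod.ts";

const client = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });

const stream = client.beta.chat.completions.stream({
  model: "gpt-4o-mini", // assumed model name
  messages: [{ role: "user", content: "Write a haiku" }],
});

// The snapshot holds everything accumulated from the chunks received so far.
stream.on("chunk", () => {
  console.log(stream.currentChatCompletionSnapshot?.choices[0]?.message.content);
});

await stream.done();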

Methods

protected
_createChatCompletion(
client: OpenAI,
params: ChatCompletionCreateParams,
options?: Core.RequestOptions,
): Promise<ParsedChatCompletion<ParsedT>>
protected
_fromReadableStream(readableStream: ReadableStream, options?: Core.RequestOptions): Promise<ChatCompletion>
[Symbol.asyncIterator](this: ChatCompletionStream<ParsedT>): AsyncIterator<ChatCompletionChunk>
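
Because the class implements AsyncIterable<ChatCompletionChunk>, the stream can also be consumed with for await. A minimal sketch, again assuming the gpt-4o-mini model and an OPENAI_API_KEY environment variable:

import OpenAI from "https://deno.land/x/openai@v4.56.0/mod.ts";

const client = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });

const stream = client.beta.chat.completions.stream({
  model: "gpt-4o-mini", // assumed model name
  messages: [{ role: "user", content: "Count to five" }],
});

// Each iteration yields one raw ChatCompletionChunk as it arrives.
for await (const chunk of stream) {
  const delta = chunk.choices[0]?.delta?.content ?? "";
  await Deno.stdout.write(new TextEncoder().encode(delta));
}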

Static Methods

createChatCompletion<ParsedT>(
client: OpenAI,
params: ChatCompletionStreamParams,
options?: Core.RequestOptions,
): ChatCompletionStream<ParsedT>

fromReadableStream(stream: ReadableStream): ChatCompletionStream<null>

Intended for use on the frontend, consuming a stream produced with .toReadableStream() on the backend.

Note that messages sent to the model do not appear in .on('message') in this context.
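
A rough backend/frontend sketch of that pattern; the server address, handler shape, and model name are illustrative assumptions:

// Backend (Deno): produce the stream and proxy it to the client.
import OpenAI from "https://deno.land/x/openai@v4.56.0/mod.ts";

const client = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });

Deno.serve(() => {
  const stream = client.beta.chat.completions.stream({
    model: "gpt-4o-mini", // assumed model name
    messages: [{ role: "user", content: "Tell me a joke" }],
  });
  return new Response(stream.toReadableStream());
});

// Frontend: rebuild a ChatCompletionStream from the response body.
import { ChatCompletionStream } from "https://deno.land/x/openai@v4.56.0/resources/beta/chat/completions.ts";

const res = await fetch("http://localhost:8000"); // assumed server address
const runner = ChatCompletionStream.fromReadableStream(res.body!);

runner.on("content", (delta) => console.log(delta));
await runner.finalChatCompletion();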