Module

x/openai/resources/mod.ts>ChatCompletionChunk

Deno build of the official Typescript library for the OpenAI API.
namespace ChatCompletionChunk
import { ChatCompletionChunk } from "https://deno.land/x/openai@v4.52.0/resources/mod.ts";
interface ChatCompletionChunk
import { type ChatCompletionChunk } from "https://deno.land/x/openai@v4.52.0/resources/mod.ts";

Represents a streamed chunk of a chat completion response returned by the model, based on the provided input.

Properties

id: string

A unique identifier for the chat completion. Each chunk has the same ID.

choices: Array&lt;ChatCompletionChunk.Choice&gt;

A list of chat completion choices. Can contain more than one element if n is greater than 1. Can also be empty for the last chunk if you set stream_options: {"include_usage": true}.
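A common pattern is to concatenate the delta content from each chunk's choices into the full response text. The sketch below uses hand-built sample objects shaped like this interface (the Delta, Choice, and Chunk types here are simplified stand-ins, not the library's full definitions); in real use the chunks come from a streaming chat completion call.

```typescript
// Minimal sketch: accumulating streamed delta content across chunks.
// These interfaces are simplified stand-ins for the library's types.

interface Delta {
  content?: string;
}

interface Choice {
  index: number;
  delta: Delta;
  finish_reason: string | null;
}

interface Chunk {
  id: string;
  choices: Choice[];
  created: number;
  model: string;
  object: "chat.completion.chunk";
}

// Concatenate the delta content of choice index 0 from every chunk.
function accumulateContent(chunks: Chunk[]): string {
  let text = "";
  for (const chunk of chunks) {
    for (const choice of chunk.choices) {
      if (choice.index === 0 && choice.delta.content) {
        text += choice.delta.content;
      }
    }
  }
  return text;
}

// Hand-built sample chunks; every chunk in one stream shares the same id.
const sample: Chunk[] = [
  { id: "c1", created: 0, model: "m", object: "chat.completion.chunk",
    choices: [{ index: 0, delta: { content: "Hel" }, finish_reason: null }] },
  { id: "c1", created: 0, model: "m", object: "chat.completion.chunk",
    choices: [{ index: 0, delta: { content: "lo" }, finish_reason: "stop" }] },
];

console.log(accumulateContent(sample)); // "Hello"
```

Filtering on choice.index matters when n is greater than 1, since each chunk may then interleave deltas for several choices.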

created: number

The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.

model: string

The model used to generate the completion.

object: "chat.completion.chunk"

The object type, which is always chat.completion.chunk.

optional
service_tier: "scale" | "default" | null

The service tier used for processing the request. This field is only included if the service_tier parameter is specified in the request.

optional
system_fingerprint: string

This fingerprint represents the backend configuration that the model runs with. Can be used in conjunction with the seed request parameter to understand when backend changes have been made that might impact determinism.

optional
usage: CompletionsAPI.CompletionUsage

An optional field that will only be present when you set stream_options: {"include_usage": true} in your request. When present, it contains a null value except for the last chunk which contains the token usage statistics for the entire request.
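The usage behavior above can be sketched as follows: with include_usage set, every chunk carries usage: null except the final one. This is a hedged illustration over hand-built sample data (the Usage and Chunk types here are simplified stand-ins for the library's types).

```typescript
// Sketch: pulling token usage from the final chunk when
// stream_options: {"include_usage": true} was set on the request.
// Simplified stand-in types; sample data only.

interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

interface Chunk {
  id: string;
  usage?: Usage | null;
}

// With include_usage, usage is null on every chunk except the last,
// so the first non-null usage found is the stream's total.
function extractUsage(chunks: Chunk[]): Usage | null {
  for (const chunk of chunks) {
    if (chunk.usage) return chunk.usage;
  }
  return null;
}

const stream: Chunk[] = [
  { id: "c1", usage: null },
  { id: "c1", usage: null },
  { id: "c1",
    usage: { prompt_tokens: 9, completion_tokens: 12, total_tokens: 21 } },
];

console.log(extractUsage(stream)?.total_tokens); // 21
```

Note that the final chunk carrying usage may have an empty choices array, which is why consumers should not assume every chunk contains a delta.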