
⚙️ A simple tokenizer for Deno
class Tokenizer
implements IterableIterator<Token>
import { Tokenizer } from "https://deno.land/x/tokenizer@0.1.0/tokenizer.ts";

Tokenizes the given source string into tokens

Constructors

new
Tokenizer(rules: Rule[])

Constructs a new Tokenizer

new
Tokenizer(source: string, rules: Rule[])
new
Tokenizer(sourceOrRules: string | Rule[], rulesOrNothing?: Rule[])

Properties

private
_index: number
readonly
done: boolean

Checks if the Tokenizer is done scanning the source string

readonly
index: number

The current index of the Tokenizer in the source string

readonly
rules: Rule[]

The rules that tell the Tokenizer which patterns to look for

readonly
source: string

The string that will be scanned

unexpectedCharacterError: () => void

Methods

private
match(text: string, pattern: Pattern): { match: string; groups: string[]; }
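The behavior of the private match method can be sketched as follows. This is a hedged, self-contained illustration rather than the module's actual source: the Pattern type is not shown on this page, so a plain RegExp stands in for it, and the sketch returns undefined on a failed match where the real implementation may handle failure differently.

```typescript
// Illustrative stand-in for the private match() method: attempt the
// pattern at the start of the text and report the matched substring
// plus any capture groups. A RegExp stands in for Pattern here.
function matchSketch(
  text: string,
  pattern: RegExp,
): { match: string; groups: string[] } | undefined {
  // "y" (sticky) anchors the match at lastIndex (0 here), so the
  // pattern must match at the very start of the text.
  // Note: rebuilding from .source drops any flags the pattern had.
  const sticky = new RegExp(pattern.source, "y");
  const result = sticky.exec(text);
  if (result === null) return undefined;
  return { match: result[0], groups: result.slice(1) };
}
```

For example, `matchSketch("foo=bar", /(\w+)=(\w+)/)` yields `{ match: "foo=bar", groups: ["foo", "bar"] }`, while `matchSketch("abc", /\d+/)` yields `undefined`.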
private
scan(): Token | undefined
next(): IteratorResult<Token>

Returns the next scanned Token

reset(): void

Resets the index of the Tokenizer

tokenize(source: string): Token[]
tokenize(source: string, callback: (token: Token) => void): Token[]
tokenize(callback: (token: Token) => void): Token[]

Tokenizes the given string (defaults to the Tokenizer's source) into a Token array
[Symbol.iterator](): IterableIterator<Token>
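Taken together, the members above describe a class that walks the source string rule by rule while implementing the iterator protocol. The sketch below is a self-contained stand-in, NOT the module's actual source: the Rule and Token shapes used here ({ type, pattern } and { type, match }) are assumptions for illustration, since neither interface is shown on this page.

```typescript
// Self-contained stand-in for the documented class. The Rule and
// Token shapes below are assumptions made for illustration only.
interface Rule {
  type: string;
  pattern: RegExp;
}

interface Token {
  type: string;
  match: string;
}

class MiniTokenizer implements IterableIterator<Token> {
  private _index = 0;

  constructor(readonly source: string, readonly rules: Rule[]) {}

  // The current index of the tokenizer in the source string
  get index(): number {
    return this._index;
  }

  // True once the whole source string has been consumed
  get done(): boolean {
    return this._index >= this.source.length;
  }

  reset(): void {
    this._index = 0;
  }

  // Try each rule at the current index; consume and return the first match.
  private scan(): Token {
    for (const rule of this.rules) {
      const sticky = new RegExp(rule.pattern.source, "y");
      sticky.lastIndex = this._index;
      const result = sticky.exec(this.source);
      if (result !== null) {
        this._index += result[0].length;
        return { type: rule.type, match: result[0] };
      }
    }
    // Stands in for the documented unexpectedCharacterError hook
    throw new Error(`Unexpected character at index ${this._index}`);
  }

  next(): IteratorResult<Token> {
    if (this.done) return { done: true, value: undefined };
    return { done: false, value: this.scan() };
  }

  [Symbol.iterator](): IterableIterator<Token> {
    return this;
  }

  tokenize(): Token[] {
    this.reset();
    const tokens: Token[] = [];
    while (!this.done) tokens.push(this.scan());
    return tokens;
  }
}

// Usage: split a tiny arithmetic expression into tokens
const tokenizer = new MiniTokenizer("12+34", [
  { type: "NUMBER", pattern: /\d+/ },
  { type: "PLUS", pattern: /\+/ },
]);
const tokens = tokenizer.tokenize();
// Each token carries the rule's type plus the matched text
```

Because the class implements IterableIterator, it can also be consumed with `for (const token of tokenizer)` or the spread operator, with `reset()` rewinding the index for another pass.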