Module

x/dendron_exports/deps/micromark.ts

Export your Dendron vault into SSG-compatible markdown
import * as dendronExports from "https://deno.land/x/dendron_exports@v0.2.2/deps/micromark.ts";

Functions

Create an extension for micromark to enable math syntax.
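
A minimal usage sketch, assuming the function listed here behaves like `math()` from micromark-extension-math and is paired with `mathHtml()` for output; the esm.sh import URLs below are an assumption about how micromark and the math extension are resolved, not something this module provides.

import { micromark } from "https://esm.sh/micromark@3";
import { math, mathHtml } from "https://esm.sh/micromark-extension-math@2";

// Tokenize dollar-delimited math and serialize it with the KaTeX-based
// HTML extension.
const html = micromark("Euler: $e^{i\\pi} + 1 = 0$", {
  extensions: [math()], // changes how the markdown is tokenized
  htmlExtensions: [mathHtml()], // changes how the tokens are serialized
});

console.log(html);
// Roughly: <p>Euler: <span class="math math-inline">…</span></p> (KaTeX markup elided)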

Interfaces

State tracked to compile events as HTML.

Compile options.

State shared between container calls.

A syntax extension changes how markdown is tokenized; a sketch follows at the end of this list.

Normalized extension.

A context object that helps with parsing markdown.

Config defining how to parse.

A token: a span of chunks.

A context object that helps with tokenizing markdown constructs.

Map of allowed token types.
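
A minimal sketch of the syntax-extension shape described above, together with the Construct, State, and Effects types listed under Type Aliases below. Everything here is hypothetical and illustrative only: the "mention" construct, its token type, and the helper function are not part of this module; they merely show how a construct's tokenize function builds a small state machine out of State functions.

// Hypothetical text-level construct that turns "@name" into a "mention" token.
function alphanumeric(code) {
  // ASCII digits and letters; `null` means end of file.
  return (
    code !== null &&
    ((code >= 48 && code <= 57) || // 0-9
      (code >= 65 && code <= 90) || // A-Z
      (code >= 97 && code <= 122)) // a-z
  );
}

// A Construct: a name plus a tokenize function that wires up the state machine.
const mention = {
  name: "mention",
  tokenize(effects, ok, nok) {
    return start;

    // Each State gets one character code, uses `effects` to enter/consume/exit
    // tokens, and returns the next State.
    function start(code) {
      effects.enter("mention"); // open a token
      effects.consume(code); // the "@" (code 64) that triggered the construct
      return first;
    }

    function first(code) {
      if (!alphanumeric(code)) return nok(code); // not a mention after all
      effects.consume(code);
      return rest;
    }

    function rest(code) {
      if (alphanumeric(code)) {
        effects.consume(code);
        return rest;
      }
      effects.exit("mention"); // close the token
      return ok(code); // hand the unconsumed code back to the caller
    }
  },
};

// The syntax extension itself: constructs mapped from the character code (64,
// "@") at which they may start, hooked into the `text` content type.
const mentionSyntax = { text: { 64: mention } };
// Passed to micromark as `extensions: [mentionSyntax]`; see the HTML-extension
// sketch at the end of the Type Aliases list.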

Type Aliases

Attempt deals with several values, and tries to parse according to those values.

A chunk is either a character code or a slice of a buffer in the form of a string.

A character code.

Serialize micromark events as HTML.

HTML compiler context.

An object describing how to parse a markdown construct.

Several constructs, mapped from their initial codes.

Deal with the character and move to the next.

Enumeration of the content types.

Create a context.

Definition.

Handle the whole document.

A context object to transition the state machine.

Encodings supported by the buffer class.

Open a token.

The start or end of a token amongst other events.

Close a token.

Like a tokenizer, but without ok or nok, and returning void.

A full, filtered, normalized extension.

Handle one token.

Token types mapping to handles.

Like a construct, but tokenize does not accept ok or nok.

Like a tokenizer, but without ok or nok.

Type of line ending in markdown.

A filtered, combined extension.

An HTML extension changes how markdown tokens are serialized; a sketch follows at the end of this list.

Configuration.

A location in the document (line/column/offset) and chunk (_index, _bufferIndex).

Guard whether code can come before the construct.

A resolver handles and cleans events coming from tokenize.

The main unit in the state machine: a function that gets a character code and has certain effects.

A tokenize function sets up a state machine to handle character codes streaming in.

Enum of allowed token types.

Contents of the file.
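
A companion sketch for the HTML-extension shape described above: handles keyed by token type that run while events are serialized to HTML. It reuses the hypothetical `mention` construct from the sketch after the Interfaces list, and it assumes micromark itself resolves via esm.sh; none of these names come from this module.

import { micromark } from "https://esm.sh/micromark@3";
// `mentionSyntax` is the hypothetical syntax extension defined in the sketch
// after the Interfaces list above.

// An HTML extension: Handles keyed by token type. Each handle is called with
// the HTML compiler context as `this` and the token being exited.
const mentionHtml = {
  exit: {
    mention(token) {
      const value = this.sliceSerialize(token); // e.g. "@octocat"
      this.tag('<a href="/users/' + this.encode(value.slice(1)) + '">');
      this.raw(this.encode(value));
      this.tag("</a>");
    },
  },
};

console.log(
  micromark("Ping @octocat", {
    extensions: [mentionSyntax], // changes tokenization
    htmlExtensions: [mentionHtml], // changes serialization
  }),
);
// Roughly: <p>Ping <a href="/users/octocat">@octocat</a></p>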