Tokenization
alpaca.internal.lexer.Tokenization
abstract transparent class Tokenization[Ctx <: LexerCtx](using betweenStages: OnTokenMatch[Ctx], errorHandling: ErrorHandling[Ctx], empty: Empty[Ctx]) extends Selectable
The result of compiling a lexer definition.
This abstract class represents a compiled lexer that can tokenize input. It is generated by the lexer macro and provides methods to access tokens and perform tokenization.
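Because a generated `Tokenization` extends `Selectable`, the token rules declared in the lexer definition are typically exposed as structural members of the refined result type. The sketch below illustrates the idea only; the member names (`tokenize`, `NUMBER`, `IDENT`) and the shape of the refinement are assumptions for illustration, not the library's confirmed API.

```scala
// Hypothetical sketch: the lexer macro would produce a concrete
// Tokenization refined with one member per declared token rule.
// All names below other than Tokenization itself are assumed.
import alpaca.internal.lexer.Tokenization

// A macro-generated lexer might be typed as a structural refinement,
// so each token rule is reachable via Selectable member selection:
//
//   val lexer: Tokenization[MyCtx] {
//     val NUMBER: TokenDef
//     val IDENT: TokenDef
//   } = lex { ... }            // generated by the lexer macro
//
//   // Structural selection resolves through Selectable:
//   val num = lexer.NUMBER
//
//   // Tokenizing input would then yield a token stream:
//   val tokens = lexer.tokenize("x = 42")
```

The `Selectable` supertype is what makes this work without reflection at the call site: member selection on the refinement is rewritten by the compiler into a lookup on the generated instance.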
Type parameters
- Ctx: the global context type
Attributes
- Experimental: true
- Source: Tokenization.scala
Supertypes
- trait Selectable
- class Object
- trait Matchable
- class Any
