This abstract class represents a compiled lexer that can tokenize input. It is generated by the lexer macro and provides methods to access tokens and perform tokenization.
final def tokenize(input: CharSequence): (ctx: Ctx, lexemes: List[Lexeme])
Tokenizes the input character sequence.
Processes the input from start to end, matching tokens and accumulating a list of lexemes. Throws a RuntimeException if an unexpected character is encountered.
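The loop described above can be sketched as follows. This is a minimal, hand-written illustration, not the generated implementation: the `Lexeme` case class, its `kind`/`text` fields, and the identifier/whitespace rules are hypothetical stand-ins for whatever the lexer macro actually emits.

```scala
// Hypothetical stand-in for the generated lexer's lexeme type.
final case class Lexeme(kind: String, text: String)

// Sketch of the tokenization loop: scan left to right, match a token,
// append a lexeme, and fail loudly on anything unmatched.
def tokenize(input: CharSequence): List[Lexeme] =
  val out = scala.collection.mutable.ListBuffer.empty[Lexeme]
  var i = 0
  while i < input.length do
    val c = input.charAt(i)
    if c.isLetter then
      // Longest-match run of letters becomes one "ident" lexeme.
      val start = i
      while i < input.length && input.charAt(i).isLetter do i += 1
      out += Lexeme("ident", input.subSequence(start, i).toString)
    else if c.isWhitespace then
      i += 1 // skip whitespace between tokens
    else
      // Mirrors the documented failure mode for unexpected characters.
      throw new RuntimeException(s"Unexpected character '$c' at offset $i")
  out.toList
```

Calling `tokenize("foo bar")` under these toy rules yields `List(Lexeme("ident", "foo"), Lexeme("ident", "bar"))`, while an input containing, say, `?` raises the RuntimeException.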