Tokenization core implementation.
#include "lexer.h"
Functions

size_t tokcnt(const char *const line)
    Counts the number of tokens in a given string (or file content).

void toknz_segtoset(tokset_t *const set, const size_t token_index, const char *const line, const size_t start, const size_t end, const size_t line_no, const tokcat_e category, const size_t column)
    Tokenizes a segment of a line and stores the resulting token in the token set.

tokset_t *toknz(const char *const line)
    Tokenizes a line (or multiple lines of code) into a set of tokens.
Handles the conversion of source code into token streams: