Package com.gengoai.parsing
Interface Lexer
- All Superinterfaces:
Serializable
public interface Lexer extends Serializable
A Lexer tokenizes a string or resource into tokens.
- Author:
- David B. Bracewell
Method Summary
Modifier and Type     Method                        Description
static Lexer          create(TokenDef... tokens)    Creates a regular expression based lexer over the given token definitions.
default TokenStream   lex(Resource resource)        Reads from the given resource and tokenizes it into tokens.
TokenStream           lex(String input)             Tokenizes the input string into tokens.
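The sketch below shows one way to build a lexer with create(TokenDef...). It is a minimal illustration and assumes that TokenDef can be implemented by an enum that exposes a regular-expression pattern through a getPattern() accessor; the enum, its token names, and that accessor are assumptions, so check the TokenDef javadoc for the actual contract.

   import com.gengoai.parsing.Lexer;
   import com.gengoai.parsing.TokenDef;

   public class LexerCreateExample {

      // Hypothetical token definitions for a toy arithmetic language. This assumes
      // an enum may implement TokenDef by supplying the regular-expression pattern
      // for each token type; the getPattern() accessor name is an assumption.
      enum ArithmeticToken implements TokenDef {
         NUMBER("\\d+"),
         PLUS("\\+"),
         WHITESPACE("\\s+");

         private final String pattern;

         ArithmeticToken(String pattern) {
            this.pattern = pattern;
         }

         @Override
         public String getPattern() {
            return pattern;
         }
      }

      public static void main(String[] args) {
         // create(TokenDef...) builds a regular expression based lexer over the definitions.
         Lexer lexer = Lexer.create(ArithmeticToken.values());
      }
   }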
Method Detail
-
lex
TokenStream lex(String input)
Tokenizes the input string into tokens.
- Parameters:
  input - the input to tokenize
- Returns:
  A token stream wrapping the tokenization results
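A brief usage sketch for lex(String), reusing the hypothetical ArithmeticToken definitions from the sketch under the method summary. The TokenStream accessors hasNext() and consume() used to drain the stream are assumptions; check the TokenStream javadoc for its actual iteration methods.

   import com.gengoai.parsing.Lexer;
   import com.gengoai.parsing.TokenStream;

   public class LexStringExample {
      public static void main(String[] args) {
         Lexer lexer = Lexer.create(ArithmeticToken.values());
         // Tokenize an in-memory string.
         TokenStream tokens = lexer.lex("1 + 2 + 3");
         // Drain the stream; hasNext() and consume() are assumed accessors.
         while (tokens.hasNext()) {
            System.out.println(tokens.consume());
         }
      }
   }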
-
lex
default TokenStream lex(Resource resource) throws IOException
Reads from the given resource and tokenizes it into tokens.
- Parameters:
  resource - the resource to read and tokenize
- Returns:
  A token stream wrapping the tokenization results
- Throws:
  IOException - Something went wrong reading from the input resource
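A sketch of reading and tokenizing a Resource, again reusing the hypothetical ArithmeticToken definitions from the earlier sketch. The com.gengoai.io.resource.Resource import path and the TokenStream accessors are assumptions; the point of the example is handling the declared IOException.

   import java.io.IOException;
   import com.gengoai.io.resource.Resource;
   import com.gengoai.parsing.Lexer;
   import com.gengoai.parsing.TokenStream;

   public class LexResourceExample {

      // Reads and tokenizes the given resource, handling the declared IOException.
      static void lexResource(Lexer lexer, Resource input) {
         try {
            TokenStream tokens = lexer.lex(input);
            while (tokens.hasNext()) {        // assumed TokenStream accessors
               System.out.println(tokens.consume());
            }
         } catch (IOException e) {
            // Raised when something goes wrong reading from the input resource.
            e.printStackTrace();
         }
      }
   }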