| Package | Description |
|---|---|
| gw.lang | |
| gw.lang.gosuc | |
| gw.lang.parser | |
| Modifier and Type | Method and Description |
|---|---|
| static ISourceCodeTokenizer | GosuShop.createSourceCodeTokenizer(CharSequence code) |
| ISourceCodeTokenizer | IGosuShop.createSourceCodeTokenizer(CharSequence code) |
| static ISourceCodeTokenizer | GosuShop.createSourceCodeTokenizer(CharSequence code, boolean bTemplate) |
| ISourceCodeTokenizer | IGosuShop.createSourceCodeTokenizer(CharSequence code, boolean bTemplate) |
| static ISourceCodeTokenizer | GosuShop.createSourceCodeTokenizer(Reader reader) |
| ISourceCodeTokenizer | IGosuShop.createSourceCodeTokenizer(Reader reader) |
| Modifier and Type | Method and Description |
|---|---|
| static ITokenizerInstructor | GosuShop.createTemplateInstructor(ISourceCodeTokenizer tokenizer) |
| ITokenizerInstructor | IGosuShop.createTemplateInstructor(ISourceCodeTokenizer tokenizer) |
| static ITokenizerInstructor | GosuShop.createTemplateTokenizerInstructor(ISourceCodeTokenizer tokenizer) |
| ITokenizerInstructor | IGosuShop.createTemplateTokenizerInstructor(ISourceCodeTokenizer tokenizer) |
| Modifier and Type | Method and Description |
|---|---|
| ISourceCodeTokenizer | GosucProjectParser.getTokenizer() |
| Modifier and Type | Method and Description |
|---|---|
| ISourceCodeTokenizer | ISource.getTokenizer() |
| ISourceCodeTokenizer | FileSource.getTokenizer() |
| ISourceCodeTokenizer | IGosuParser.getTokenizer() |
| ISourceCodeTokenizer | StringSource.getTokenizer() |
| ISourceCodeTokenizer | ISourceCodeTokenizer.lightweightRestore() |
| Modifier and Type | Method and Description |
|---|---|
| ITokenizerInstructor | ITokenizerInstructor.createNewInstance(ISourceCodeTokenizer tokenizer) |
| void | ISource.setTokenizer(ISourceCodeTokenizer tokenizer) |
| void | FileSource.setTokenizer(ISourceCodeTokenizer tokenizer) |
| void | ITokenizerInstructor.setTokenizer(ISourceCodeTokenizer tokenizer) |
| void | IGosuParser.setTokenizer(ISourceCodeTokenizer tokenizer) |
| void | StringSource.setTokenizer(ISourceCodeTokenizer tokenizer) |
Copyright © 2022. All rights reserved.