Description
Tokenizing currently runs over the entire text on every trigger, which makes the editor very slow; the waiting time grows exponentially with document size. The maximum typing latency should be around ##, but it is currently much higher.
The tokenizer is triggered by the syllable marker '་', the phrase marker '།', and the return character. At the moment the tokenizer processes the whole document in one go. It should instead tokenize one phrase at a time, where a phrase is a string of characters delimited by '།'.
How to reproduce
Proposed solution
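A minimal sketch of the phrase-at-a-time idea described above, assuming a Python codebase. The `tokenize_phrase` function here is a hypothetical stand-in for the editor's real tokenizer; the point is locating the single phrase (delimited by '།') that contains the edit position and re-tokenizing only that slice instead of the whole document.

```python
# Assumption: '།' (shad) delimits phrases and '་' (tsheg) delimits
# syllables, as described in the issue. tokenize_phrase is a
# hypothetical placeholder for the real tokenizer.
PHRASE_MARKER = "།"
SYLLABLE_MARKER = "་"

def tokenize_phrase(phrase):
    # Placeholder tokenizer: split one phrase into syllables.
    return [s for s in phrase.split(SYLLABLE_MARKER) if s]

def phrase_bounds(text, cursor):
    """Return (start, end) of the phrase containing the edit position."""
    start = text.rfind(PHRASE_MARKER, 0, cursor) + 1  # 0 if no earlier shad
    end = text.find(PHRASE_MARKER, cursor)
    if end == -1:
        end = len(text)
    return start, end

def retokenize_at(text, cursor):
    """Re-tokenize only the edited phrase, not the whole document."""
    start, end = phrase_bounds(text, cursor)
    return tokenize_phrase(text[start:end])
```

With this approach the cost of a keystroke depends on the length of one phrase rather than the length of the whole document, which should keep typing latency roughly constant as the text grows.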