Commit
fix: parser adding addition tokens for no reason
THEGOLDENPRO committed Sep 15, 2024
1 parent 4d0f3d4 commit 53211ad
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion osaker/lexer.py
@@ -62,7 +62,7 @@ def tokenize(self, string: str) -> List[Token]:
match = None

for token_type in self.tokens_compiled:
- match = self.tokens_compiled[token_type].match(string, position)
+ match = self.tokens_compiled[token_type].match(line, position)

if match:
token_value = match.group(0)
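The fix is easier to see in context: `tokenize` walks the input with a line-relative `position`, so matching each compiled pattern against the whole `string` at that offset reads characters from the wrong place and can emit spurious tokens. Matching against the current `line` keeps the offset and the text in sync. A minimal runnable sketch of that loop, assuming a line-by-line lexer (the token table, whitespace handling, and error case here are illustrative assumptions; only `tokenize` and `tokens_compiled` appear in the actual diff):

```python
import re

# Hypothetical token table for illustration; the real lexer in
# osaker/lexer.py defines its own.
TOKENS = {
    "NUMBER": r"\d+",
    "PLUS": r"\+",
    "WS": r"[ \t]+",
}
tokens_compiled = {name: re.compile(pat) for name, pat in TOKENS.items()}

def tokenize(string):
    tokens = []
    for line in string.splitlines():
        position = 0
        while position < len(line):
            match = None
            for token_type in tokens_compiled:
                # The fix: match against the current line, not the whole
                # input string, so the line-relative `position` points at
                # the characters actually being lexed.
                match = tokens_compiled[token_type].match(line, position)
                if match:
                    token_value = match.group(0)
                    if token_type != "WS":
                        tokens.append((token_type, token_value))
                    position = match.end()
                    break
            if match is None:
                raise SyntaxError(f"Unexpected character: {line[position]!r}")
    return tokens
```

With the buggy `match(string, position)` variant, any line after the first is lexed against the first line's characters, which is how extra tokens appeared "for no reason".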
