author     Serhiy Storchaka <storchaka@gmail.com>    2023-11-02 08:42:58 (GMT)
committer  GitHub <noreply@github.com>               2023-11-02 08:42:58 (GMT)
commit     a12f624a9dc1c44bb20a20b13fd164c14b987892 (patch)
tree       cdb4981b35e460aadca8f884ee696e075e932a1b /Python/Python-tokenize.c
parent     229f44d353c71185414a072017f46f125676bdd6 (diff)
download   cpython-a12f624a9dc1c44bb20a20b13fd164c14b987892.zip
           cpython-a12f624a9dc1c44bb20a20b13fd164c14b987892.tar.gz
           cpython-a12f624a9dc1c44bb20a20b13fd164c14b987892.tar.bz2
Remove unnecessary includes (GH-111633)
Diffstat (limited to 'Python/Python-tokenize.c')
-rw-r--r--  Python/Python-tokenize.c | 1 -
1 file changed, 0 insertions(+), 1 deletion(-)
diff --git a/Python/Python-tokenize.c b/Python/Python-tokenize.c
index 83b4aa4..364fe55 100644
--- a/Python/Python-tokenize.c
+++ b/Python/Python-tokenize.c
@@ -4,7 +4,6 @@
 #include "../Parser/lexer/lexer.h"
 #include "../Parser/tokenizer/tokenizer.h"
 #include "../Parser/pegen.h"           // _PyPegen_byte_offset_to_character_offset()
-#include "../Parser/pegen.h"           // _PyPegen_byte_offset_to_character_offset()
 
 static struct PyModuleDef _tokenizemodule;