author     Brett Cannon <bcannon@gmail.com>  2011-02-22 03:25:12 (GMT)
committer  Brett Cannon <bcannon@gmail.com>  2011-02-22 03:25:12 (GMT)
commit  f3042782af65fbf68ca7e343357144c676b3fd54 (patch)
tree    e1589872758b0742df86d7e7e18cc0e1a9a51062 /Lib/tokenize.py
parent  eeb114b028f7aef886e0b1b514d58aac9d26bc8c (diff)
Issue #11074: Make 'tokenize' reloadable.
The module stored away the 'open' object as found in the global namespace (which fell through to the built-in namespace) because it defines its own 'open'. The problem is that reloading the module then grabbed the 'open' defined by the previous load, leading to code that recursed infinitely. Switched to calling builtins.open directly.
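The failure mode can be sketched with a toy module (a hypothetical mymod.py, not the actual tokenize source): a module-level alias of the shadowed name 'open' resolves to the built-in only on the first import, while on reload it resolves to the stale function left behind by the previous load, which then calls back into itself.

# mymod.py -- minimal sketch of the pre-fix pattern (hypothetical module,
# not the tokenize source).
_builtin_open = open        # first import: falls through to builtins.open
                            # reload: picks up the open() defined below by
                            # the previous load of this same module dict

def open(filename):
    # After a reload, _builtin_open names the previous load's open(), whose
    # own lookup of _builtin_open lands back on itself, so the call never
    # reaches the real built-in and recurses until the interpreter gives up.
    return _builtin_open(filename)

# Demonstration (hypothetical):
#   >>> import importlib, mymod
#   >>> importlib.reload(mymod)
#   >>> mymod.open('setup.py')   # recursion error: stale open() calls itself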
Diffstat (limited to 'Lib/tokenize.py')
-rw-r--r--  Lib/tokenize.py  5
1 file changed, 2 insertions(+), 3 deletions(-)
diff --git a/Lib/tokenize.py b/Lib/tokenize.py
index 506aa6a..f575e9b 100644
--- a/Lib/tokenize.py
+++ b/Lib/tokenize.py
@@ -24,6 +24,7 @@ __author__ = 'Ka-Ping Yee <ping@lfw.org>'
 __credits__ = ('GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, '
                'Skip Montanaro, Raymond Hettinger, Trent Nelson, '
                'Michael Foord')
+import builtins
 import re
 import sys
 from token import *
@@ -335,13 +336,11 @@ def detect_encoding(readline):
     return default, [first, second]
 
 
-_builtin_open = open
-
 def open(filename):
     """Open a file in read only mode using the encoding detected by
     detect_encoding().
     """
-    buffer = _builtin_open(filename, 'rb')
+    buffer = builtins.open(filename, 'rb')
     encoding, lines = detect_encoding(buffer.readline)
     buffer.seek(0)
     text = TextIOWrapper(buffer, encoding, line_buffering=True)
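For context, a brief usage sketch of the function being patched (the file name is hypothetical): tokenize.open() reads the file in binary mode, runs detect_encoding() over it, rewinds the buffer, and returns a TextIOWrapper decoded with the detected encoding.

# Usage sketch (hypothetical file name), assuming the fixed module.
import tokenize

with tokenize.open('some_module.py') as f:
    source = f.read()   # text decoded per the file's BOM or coding cookie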