path: root/Lib/tokenize.py
author     Guido van Rossum <guido@python.org>   2007-05-15 18:46:22 (GMT)
committer  Guido van Rossum <guido@python.org>   2007-05-15 18:46:22 (GMT)
commit     1bc535dc7854b6be009a6bf3413a3a470e3fe749 (patch)
tree       7a43646468849a9ae624bd4314ff26b7b0e30f21 /Lib/tokenize.py
parent     360e4b8fb19f34360093bc15ef9aad13115a6069 (diff)
Merged revisions 55328-55341 via svnmerge from
svn+ssh://pythondev@svn.python.org/python/branches/p3yk

........
  r55329 | brett.cannon | 2007-05-14 16:36:56 -0700 (Mon, 14 May 2007) | 3 lines

  Implement the removal of tuple parameter unpacking (PEP 3113).
  Thanks, Tony Lownds for the patch.
........
  r55331 | neal.norwitz | 2007-05-14 16:40:30 -0700 (Mon, 14 May 2007) | 1 line

  Update to use Python 3.0
........
  r55332 | brett.cannon | 2007-05-14 16:47:18 -0700 (Mon, 14 May 2007) | 2 lines

  Mention PEP 3113. And thanks to Tony Lownds for the PEP 3113 patch.
........
  r55333 | neal.norwitz | 2007-05-14 16:57:06 -0700 (Mon, 14 May 2007) | 1 line

  Fix exception printing (no more exceptions module)
........
  r55334 | neal.norwitz | 2007-05-14 17:11:10 -0700 (Mon, 14 May 2007) | 1 line

  Remove popen* functions from os
........
  r55335 | neal.norwitz | 2007-05-14 18:03:38 -0700 (Mon, 14 May 2007) | 1 line

  Get rid of most of popen. There are still some uses I need to clean up.
........
  r55336 | neal.norwitz | 2007-05-14 21:11:34 -0700 (Mon, 14 May 2007) | 1 line

  Remove a few more remnants of the compiler package
........
  r55337 | neal.norwitz | 2007-05-14 22:28:27 -0700 (Mon, 14 May 2007) | 1 line

  Get test_[cx]pickle working on 64-bit platforms (avoid int/long overflow)
........
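For context on the PEP 3113 change that drives the tokenize.py patch below: Python 2 allowed a function signature to unpack a tuple argument in place, which Python 3 rejects as a SyntaxError. A minimal sketch of the migration pattern follows; the distance function is a made-up illustration, not code from this commit:

# Python 2 tuple parameter unpacking, removed by PEP 3113:
#   def distance((x1, y1), (x2, y2)):   # SyntaxError in Python 3
#       return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

# Python 3 equivalent: accept plain parameters and unpack in the body.
def distance(p1, p2):
    (x1, y1), (x2, y2) = p1, p2
    return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

print(distance((0, 0), (3, 4)))  # -> 5.0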
Diffstat (limited to 'Lib/tokenize.py')
-rw-r--r--  Lib/tokenize.py | 3
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/Lib/tokenize.py b/Lib/tokenize.py
index 1a72d6f..e94d7b9 100644
--- a/Lib/tokenize.py
+++ b/Lib/tokenize.py
@@ -131,7 +131,8 @@ class TokenError(Exception): pass
 class StopTokenizing(Exception): pass
 
-def printtoken(type, token, (srow, scol), (erow, ecol), line): # for testing
+def printtoken(type, token, startrowcol, endrowcol, line): # for testing
+    (srow, scol), (erow, ecol) = startrowcol, endrowcol
     print("%d,%d-%d,%d:\t%s\t%s" % \
         (srow, scol, erow, ecol, tok_name[type], repr(token)))
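Applied to the file, the patched helper reads roughly as follows. This is a sketch, not the full module: it assumes tok_name imported from the stdlib token module (tokenize.py builds its tok_name from it), and the sample call at the end is hypothetical:

from token import tok_name, NAME

def printtoken(type, token, startrowcol, endrowcol, line):  # for testing
    # PEP 3113: unpack the (row, col) pairs in the body, not in the signature.
    (srow, scol), (erow, ecol) = startrowcol, endrowcol
    print("%d,%d-%d,%d:\t%s\t%s" %
          (srow, scol, erow, ecol, tok_name[type], repr(token)))

# Hypothetical call for a NAME token spanning columns 0-4 of line 1:
printtoken(NAME, "spam", (1, 0), (1, 4), "spam = 1\n")
# -> 1,0-1,4:     NAME    'spam'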