| Commit message | Author | Date | Files | Lines |
|---|---|---|---|---|
| Issue2495: tokenize.untokenize did not insert space between two consecutive s... | Amaury Forgeot d'Arc | 2008-03-27 | 1 | -1/+7 |
| PEP-0318, @decorator-style. In Guido's words: | Anthony Baxter | 2004-08-02 | 1 | -1/+12 |
| Undo Barry's change. This file is not imported, it's fed as input to | Guido van Rossum | 2002-08-29 | 1 | -12/+12 |
| The test_tokenize output has changed slightly, by the addition of some | Barry Warsaw | 2002-08-29 | 1 | -12/+12 |
| Update to reflect new tokenize_test.py | Jeremy Hylton | 2001-04-13 | 1 | -8/+8 |
| Show '\011', '\012', and '\015' as '\t', '\n', '\r' in strings. | Ka-Ping Yee | 2001-01-24 | 1 | -154/+154 |
| Ugh. Sorry. Checked in the wrong file. Please ignore revision 1.3; | Ka-Ping Yee | 2001-01-15 | 1 | -154/+154 |
| Add tokenizer support and tests for u'', U"", uR'', Ur"", etc. | Ka-Ping Yee | 2001-01-15 | 1 | -365/+421 |
| Track changes in tokenize.py | Guido van Rossum | 1998-04-03 | 1 | -30/+30 |
| Tests for tokenize.py (Ka-Ping Yee) | Guido van Rossum | 1997-10-27 | 1 | -0/+592 |