Commit messages
|
*are* obsolete; three variables and the maketrans() function are not
(yet) obsolete.
Add a compensating warnings.filterwarnings() call to test_strop.py.
Add this to the NEWS.
|
meaning infinity -- but at least warn about it in the code! I pissed
away a couple hours on this today, and don't wish the same on the next
in line.
Bugfix candidate.
|
means "replace everything". But the string module, string.replace()
and test_string.py believe a 0 count means "replace nothing".
"Nothing" wins, strop loses.
Bugfix candidate.
|
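An illustrative sketch of the count convention the message above settles on
(a hypothetical helper, not the actual stropmodule.c code): zero means
"replace nothing" and a negative count means "replace everything".

/* Hypothetical helper showing the convention described above. */
#include <stddef.h>

static size_t
effective_replace_count(long count, size_t n_occurrences)
{
    if (count == 0)
        return 0;                 /* 0 => replace nothing */
    if (count < 0)
        return n_occurrences;     /* negative => replace everything */
    return (size_t)count < n_occurrences ? (size_t)count : n_occurrences;
}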
|
out of synch than I realized, and I managed to break replace's "count"
argument when it was 0. All is well again. Maybe.
Bugfix candidate.
|
mymemXXX stuff, and they were already out of synch. Fix the remaining
bugs in both and get them back in synch.
Bugfix release candidate.
|
Platform blew up on "123".replace("123", ""). Michael Hudson pinned the
blame on platform malloc(0) returning NULL.
This is a candidate for all bugfix releases.
|
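For context, the C standard lets malloc(0) return either NULL or a unique
pointer, so code that treats NULL as "out of memory" can misfire on a
zero-byte request. A generic sketch of the usual workaround (not
necessarily the fix made in this commit) is to request at least one byte:

/* Generic guard for platforms where malloc(0) returns NULL. */
#include <stdlib.h>

static void *
zero_safe_malloc(size_t n)
{
    return malloc(n ? n : 1);   /* never pass 0 to malloc() */
}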
|
Add definitions of INT_MAX and LONG_MAX to pyport.h.
Remove includes of limits.h and conditional definitions of INT_MAX
and LONG_MAX elsewhere.
This closes SourceForge patch #101659 and bug #115323.
|
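A rough sketch of the kind of fallback guards a portability header can
provide; the exact contents of pyport.h may differ, and the 32-bit values
below are illustrative assumptions:

/* Fallback limits for platforms whose headers do not define them. */
#ifndef INT_MAX
#define INT_MAX 2147483647            /* assumes a 32-bit int */
#endif

#ifndef LONG_MAX
#define LONG_MAX 0x7FFFFFFFL          /* assumes a 32-bit long */
#endif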
|
This should match the situation in the 1.6b1 tree.
|
Py_FatalError() from module initialization functions. The importing
mechanism already checks for PyErr_Occurred() after module importation
and it Does The Right Thing.
Unfortunately, the following were either not compiled or not tested by the
regression suite, due to issues with my development platform:
    almodule.c
    cdmodule.c
    mpzmodule.c
    puremodule.c
    timingmodule.c
|
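A sketch of the pattern being recommended: leave an exception set and
return from the init function instead of calling Py_FatalError(). This
uses the old-style (void-returning) init convention of that era, and the
module name "spam" and its error object are illustrative only:

/* Illustrative module init that reports failure via an exception. */
#include "Python.h"

static PyMethodDef spam_methods[] = {
    {NULL, NULL, 0, NULL}              /* sentinel */
};

void
initspam(void)
{
    PyObject *m, *d, *err;

    m = Py_InitModule("spam", spam_methods);
    if (m == NULL)
        return;                        /* exception already set */
    d = PyModule_GetDict(m);
    err = PyErr_NewException("spam.error", NULL, NULL);
    if (err == NULL || PyDict_SetItemString(d, "error", err) != 0)
        return;                        /* importer will see PyErr_Occurred() */
}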
|
and a couple of functions that were missed in the previous batches. Not
terribly tested, but very carefully scrutinized, three times.
All these were found by the little findkrc.py that I posted to python-dev,
which means there might be more lurking. Cases such as this:

long
func(a, b)
long a;
long b; /* flagword */
{

and other cases where the last ; in the argument list isn't followed by a
newline and an opening curly bracket. Regexps to catch all are welcome, of
course ;)
|
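For contrast, an illustration of what such a K&R definition looks like
after conversion to an ANSI-style prototype (the function and body are
made up for the example):

/* The same declaration style rewritten as an ANSI prototype. */
long
func(long a, long b /* flagword */)
{
    return a & b;
}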
|
comments, docstrings or error messages. I fixed two minor things in
test_winreg.py ("didn't" -> "Didn't" and "Didnt" -> "Didn't").
There is a minor style issue involved: Guido seems to have preferred English
spelling (behaviour, honour) in a couple of places. This patch changes that
to American spelling, which is the more prominent style in the source. I
prefer English myself, so if English is preferred, I'd be happy to supply a
patch myself ;)
|
For more comments, read the patches@python.org archives.
For documentation, read the comments in mymalloc.h and objimpl.h.
(This is not exactly what Vladimir posted to the patches list; I've
made a few changes, and Vladimir sent me a fix in private email for a
problem that only occurs in debug mode. I'm also holding back on his
change to main.c, which seems unnecessary to me.)
|
Attached you will find an update of the Unicode implementation.
The patch is against the current CVS version. I would appreciate it
if someone with CVS checkin permissions could check the changes in.
The patch contains all bug fixes and patches sent this week and also
fixes a leak in the codecs code and a bug in the free list code
for Unicode objects (which only shows up when compiling Python
with Py_DEBUG; thanks to MarkH for spotting this one).
|
PyArg_ParseTuple() format string arguments as possible.
|
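The fragment above concerns PyArg_ParseTuple() format strings. As general
background (an illustration, not this commit's actual change), a format
string can name the function after a colon so that argument errors mention
it; the function below is made up:

/* Illustrative use of the ":name" suffix in a format string. */
#include "Python.h"
#include <stdlib.h>

static PyObject *
example_atoi(PyObject *self, PyObject *args)
{
    char *s;
    int base = 10;

    if (!PyArg_ParseTuple(args, "s|i:atoi", &s, &base))
        return NULL;               /* error message will mention atoi() */
    return PyInt_FromLong(strtol(s, NULL, base));
}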
|
parameter match. Error pointed out by François
Pinard <pinard@iro.umontreal.ca> on c.l.py.
|
converted was a "digit" -- use isalnum(). This test is there only to
guard against "+" or "-" being interpreted as a valid int literal.
Reported by Takahiro Nakayama.
|
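A rough sketch of that kind of guard (generic code, not the actual strop
source): after strtol(), require the last character consumed to be
alphanumeric, so input with no real digits is rejected; isalnum() rather
than isdigit() presumably keeps hex digits such as 'f' acceptable.

/* Reject input that contains no real digits (e.g. a bare "+" or "-"). */
#include <ctype.h>
#include <stdlib.h>

static int
parse_long(const char *s, int base, long *result)
{
    char *end;
    long x = strtol(s, &end, base);

    if (end == s || !isalnum((unsigned char)end[-1]))
        return -1;                 /* nothing converted, or bad last char */
    if (*end != '\0')
        return -1;                 /* trailing junk */
    *result = x;
    return 0;
}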
|
the SunPro C compiler to choke. Removed this redundant line.
|
I've reformatted it, added a few comments, a test for tabsize <= 0,
and used the AS_STRING macro.
|
on BeOS or Windows.
|
previous version of this code would not show the offending input, even
though there was code that attempted this.)
|
tp_as_sequence or tp_as_mapping structure is made without checking it
for NULL first.
|
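A sketch of the kind of NULL check being described (generic code, not the
actual patch): both the tp_as_sequence slot and the function pointer
inside it can legitimately be NULL and should be tested before use.

/* Fetch item i of a sequence-like object, checking the slots first. */
#include "Python.h"

static PyObject *
get_item_checked(PyObject *obj, Py_ssize_t i)
{
    PySequenceMethods *sq = obj->ob_type->tp_as_sequence;

    if (sq == NULL || sq->sq_item == NULL) {
        PyErr_SetString(PyExc_TypeError, "object is not a sequence");
        return NULL;
    }
    return sq->sq_item(obj, i);
}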
|
in one of the sequence items.
|
type for all functions. However, many functions call PyArg_Parse() and
need a 0. This is so that when they didn't change anything, they can
do Py_INCREF(args); return args. Reverted this back. For atof(),
there's no reason not to use PyArg_ParseTuple(), so I changed the code
(atoi and atol already used that).
|
and a little editing by me).
|
of the remainder item (last item in list) when maxsplit is < the
number of occurrences.
|
maxsplit which is implemented in string.py but wasn't here. The
reference manual doesn't define what happens when maxsplit is negative
or larger than the number of occurrences, but in either case, I
implemented this as "all get replaced". The default value is zero,
which replaces all occurrences.
|
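An illustrative sketch of the convention this message describes (a
hypothetical helper, not the module's code): zero, a negative value, or a
value larger than the number of occurrences all mean "replace everything".
Note that a later commit above changes the meaning of zero to "replace
nothing".

/* Convention described in this commit: <= 0 or too large => no limit. */
#include <stddef.h>

static size_t
replace_limit(long maxsplit, size_t n_occurrences)
{
    if (maxsplit <= 0 || (size_t)maxsplit > n_occurrences)
        return n_occurrences;      /* replace every occurrence */
    return (size_t)maxsplit;
}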
|
Two new modules fpectl and fpetest.
Surround various and sundry f.p. operations with PyFPE_*_PROTECT macros.
|
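A sketch of how the PyFPE protection macros are typically wrapped around a
floating-point operation (illustrative function; when Python is built
without the fpectl support the macros expand to nothing):

/* Guard a floating-point divide with the PyFPE protection macros. */
#include "Python.h"

static double
protected_divide(double x, double y)
{
    double result = 0.0;
    PyFPE_START_PROTECT("divide", return 0.0)
    result = x / y;
    PyFPE_END_PROTECT(result)
    return result;
}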
|
other sequences use the Sequence protocol from the abstract API. The
algorithm has changed so that only one pass through the sequences is
made.
|
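A sketch of walking an arbitrary sequence in a single pass through the
abstract Sequence protocol (generic code, not the module's actual code):

/* Sum the lengths of the items of any sequence in one pass. */
#include "Python.h"

static Py_ssize_t
total_item_length(PyObject *seq)
{
    Py_ssize_t i, n, total = 0;

    n = PySequence_Size(seq);
    if (n < 0)
        return -1;                 /* not a sequence; exception is set */
    for (i = 0; i < n; i++) {
        PyObject *item = PySequence_GetItem(seq, i);
        Py_ssize_t len;
        if (item == NULL)
            return -1;
        len = PyObject_Length(item);
        Py_DECREF(item);
        if (len < 0)
            return -1;
        total += len;
    }
    return total;
}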
|
- 'delete' is a C++ keyword; use 'del_table' instead
- apply Py_CHARMASK() to del_table[i] before using it as an index
  *** this fixes a bug that was just reported on the list ***
- if the translation didn't make any changes, INCREF and return
  the original string
- when del_table is empty or omitted, don't copy the translation
  table to a table of ints (should be a bit faster)
Rewrote maketrans() to avoid copying the table (2-3% faster).
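A sketch of why Py_CHARMASK() matters here (generic code, not the actual
translate() implementation): on platforms where char is signed, bytes >=
0x80 are negative, so they must be masked before being used as a table
index.

/* Mark every byte of del_table in a 256-entry lookup table. */
#include "Python.h"
#include <string.h>

static void
build_delete_map(const char *del_table, Py_ssize_t del_len, char map[256])
{
    Py_ssize_t i;

    memset(map, 0, 256);
    for (i = 0; i < del_len; i++)
        map[Py_CHARMASK(del_table[i])] = 1;   /* mask avoids a negative index */
}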