line fits in reasonable screen width.

Made the presence/absence of a semicolon after macros consistent.

removed the tricks).
Changed the ENTER/LEAVE_ZLIB macros so as not to create a new block (a
new block is neither necessary nor helpful).
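
A minimal sketch of block-free macros in this style, assuming a global
zlib_lock and the usual GIL release/reacquire dance; not the verbatim
patch:

    /* Each macro expands to complete statements, so ENTER_ZLIB does not
       open a block that LEAVE_ZLIB would have to close. */
    #define ENTER_ZLIB \
            Py_BEGIN_ALLOW_THREADS \
            PyThread_acquire_lock(zlib_lock, 1); \
            Py_END_ALLOW_THREADS

    #define LEAVE_ZLIB \
            PyThread_release_lock(zlib_lock);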

Apparently this patch (rev 2.41) replaced all the good old "s#"
formats in PyArg_ParseTuple() with "S". Then it did
PyString_FromStringAndSize() to get back the values set up by the
"s#" format. It also incref'd and decref'd the string obtained by
"S" even though the argument tuple had a reference to it.
Replace PyString_AsString() calls with PyString_AS_STRING().
A good rule of thumb -- if you never check the return value of
PyString_AsString() to see if it's NULL, you ought to be using the
macro <wink>.
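
A minimal sketch of the preferred shape, with invented function names;
"s#" hands over the buffer and length directly:

    static PyObject *
    example_method(PyObject *self, PyObject *args)
    {
        char *data;
        int length;

        /* No temporary string object, so nothing to incref/decref and
           no PyString_FromStringAndSize() round trip afterwards. */
        if (!PyArg_ParseTuple(args, "s#:example", &data, &length))
            return NULL;
        return PyInt_FromLong((long)length);
    }

And where a string object is already in hand and known to be a string,
PyString_AS_STRING() applies -- there is no NULL result to check.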

Many functions used a local variable called return_error, which was
initialized to zero. If an error occurred, it was set to true. Most
of the code paths that followed were executed only if return_error was
false. goto is clearer.
The code also seemed to be written under the curious assumption that
calling Py_DECREF() on a local variable would assign NULL to the
variable. As a result, some of the error-exit code paths returned an
object that had a reference count of zero instead of just returning
NULL. Fixed the code to explicitly assign NULL after the DECREF.
A bit more reformatting, but not much.
XXX Need a much better test suite for zlib, since the current tests
don't exercise any of this broken code.
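
A minimal sketch of both points, with invented names:

    static PyObject *
    example(const char *text)
    {
        PyObject *tmp = NULL;
        PyObject *result = NULL;

        tmp = PyString_FromString(text);
        if (tmp == NULL)
            goto error;          /* no return_error flag to thread through */
        result = PyObject_Repr(tmp);
        if (result == NULL)
            goto error;
        Py_DECREF(tmp);
        return result;

      error:
        /* Py_DECREF() does not assign NULL to the variable; do that
           explicitly, and return NULL rather than a dead object. */
        Py_XDECREF(tmp);
        tmp = NULL;
        return NULL;
    }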

TeX-ified its docstring.

It sets a ZlibError exception, using the msg from the z_stream pointer
if one is available.
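
A sketch of a helper matching that description; the name zlib_error and
the exact message format are assumptions:

    static PyObject *ZlibError;          /* created during module init */

    static void
    zlib_error(z_stream zst, int err, char *msg)
    {
        if (zst.msg == Z_NULL)
            PyErr_Format(ZlibError, "Error %d %s", err, msg);
        else
            PyErr_Format(ZlibError, "Error %d %s: %.200s",
                         err, msg, zst.msg);
    }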

passed to _beginthread().

masks any exception, not just AttributeError. Fix this.

When PyString_FromStringAndSize() and _PyString_Resize() fail, they
set an exception. There's no need to set a new exception.
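
For instance, a sketch with an invented caller:

    static PyObject *
    make_buffer(int size)
    {
        PyObject *buf = PyString_FromStringAndSize(NULL, size);
        if (buf == NULL)
            return NULL;     /* exception already set; don't overwrite it */
        if (_PyString_Resize(&buf, size / 2) < 0)
            return NULL;     /* ditto; the failed resize also freed buf */
        return buf;
    }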

Consistently indent 4 spaces.
Use whitespace around operators.
Put braces in the right places.

This changes Pythread_start_thread() to return the thread ID, or -1
for an error. (It's technically an incompatible API change, but I
doubt anyone calls it.)

Mostly by Toby Dickenson and Titus Brown.
Add an optional argument to a decompression object's decompress()
method. The argument specifies the maximum length of the return
value. If the uncompressed data exceeds this length, the excess data
is stored as the unconsumed_tail attribute. (Not to be confused with
unused_data, which is a separate issue.)
Difference from SF patch: Default value for unconsumed_tail is ""
rather than None. It's simpler if the attribute is always a string.
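
Inside decompress(), the bookkeeping might look roughly like this (a
fragment, assuming the zlibmodule.c convention of a self struct holding
the z_stream as self->zst; not the actual patch):

    /* Output reached max_length: stash whatever input zlib has not
       consumed yet, so the next decompress() call can resume there. */
    Py_DECREF(self->unconsumed_tail);
    self->unconsumed_tail = PyString_FromStringAndSize(
        (char *)self->zst.next_in, self->zst.avail_in);
    if (self->unconsumed_tail == NULL)
        return NULL;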

suggested in SF patch #424475. Also document exception return.

object.c, PyObject_Str: Don't try to optimize anything except exact
string objects here; in particular, let str subclasses go thru tp_str,
same as non-str objects. This allows overrides of tp_str to take
effect.
stringobject.c:
+ string_print (str's tp_print): If the argument isn't an exact string
  object, get one from PyObject_Str.
+ string_str (str's tp_str): Make a genuine-string copy of the object if
  it's of a proper str subclass type. str() applied to a str subclass
  that doesn't override __str__ ends up here.
test_descr.py: New str_of_str_subclass() test.
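
Condensed, the two pieces might look like this (a sketch, not the
verbatim patch):

    /* object.c, PyObject_Str(): short-circuit exact strings only;
       str subclasses fall through to their tp_str. */
    if (PyString_CheckExact(v)) {
        Py_INCREF(v);
        return v;
    }

    /* stringobject.c, string_str(): hand back a genuine str, copying
       when the argument is a proper str subclass. */
    static PyObject *
    string_str(PyObject *s)
    {
        if (PyString_CheckExact(s)) {
            Py_INCREF(s);
            return s;
        }
        return PyString_FromStringAndSize(PyString_AS_STRING(s),
                                          PyString_GET_SIZE(s));
    }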

When an extension imports another extension in its
initXXX() function, the variable _Py_PackageContext is
prematurely reset to NULL. If the outer extension then
calls Py_InitModule(), the extension is installed in
sys.modules without its package name. The
manifestation of this bug is a "SystemError:
_PyImport_FixupExtension: module <package>.<extension>
not loaded".
To fix this, importdl.c just needs to retain the old
value of _Py_PackageContext and restore it after the
initXXX() method is called. The attached patch does this.
This patch applies to Python 2.1.1 and the current CVS.
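
In outline, the fix is a save/restore around the init call (variable
names assumed):

    /* importdl.c, around the call into the extension's initXXX() */
    char *oldcontext = _Py_PackageContext;
    _Py_PackageContext = packagecontext;
    (*initfunc)();               /* may import and init other extensions */
    _Py_PackageContext = oldcontext;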

size(), parse150(): try int() first, catch OverflowError, fall back to
long().

efficient:
- recurse down subclasses only once rather than for each affected
  slot;
- short-circuit recursing down subclasses when a subclass has its own
  definition of the name that caused the update_slot() calls in the
  first place;
- inline collect_ptrs().

specific, and updated some of the comments about the profile hook.
This closes SF bug #471725.

Reported by Francesco Trentini.

typo.

hotshot.stats.load(logfilename) returns a pstats.Stats instance, which is
about as compatible as it gets.

changing an application to collect profile data on one part of the
app while still making use of the profiled component, without relying
on side effects.

Added support for saving the names of the functions observed into the
profile log.
Added support for using the profiler to measure coverage without collecting
timing information (which is the slow part). Coverage logs can also be
substantially smaller than profiling logs where per-line information is
being collected.
Updated comments on the log format; corrected record type values in some
of the record descriptions.

Add support for extracting function names from the log file, keeping the
extract-names-from-sources support as a fallback.

distinguish __dict__ and __defined__ any more. In the C structure,
tp_cache takes its place -- but this hasn't been implemented yet.

Extend tests to cover a few more cases. For cPickle, test several of
the undocumented features.

Raise ValueError when an object contains an arbitrarily nested
reference to itself. (The previous fix just produced invalid
pickles.)
The solution is very much like Py_ReprEnter() and Py_ReprLeave():
fast_save_enter() and fast_save_leave() track the fast_container
limit and keep a fast_memo of objects currently being pickled.
The solution is moderately expensive for deeply nested structures,
but it still seems to be faster than normal pickling, based on tests
with deeply nested lists.
Once FAST_LIMIT is exceeded, the new code is about twice as slow as
fast-mode code that doesn't check for recursion. It's still twice as
fast as the normal pickling code. In the absence of deeply nested
structures, I couldn't measure a difference.
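
A simplified sketch of the mechanism, keying the memo on object
identity and assuming self->fast_memo already exists (the real code
differs in details):

    static int
    fast_save_enter(Picklerobject *self, PyObject *obj)
    {
        /* The memo only kicks in past FAST_LIMIT, so shallow data pays
           one increment and one comparison, nothing more. */
        if (++self->fast_container > FAST_LIMIT) {
            PyObject *key = PyLong_FromVoidPtr(obj);
            if (key == NULL)
                return 0;
            if (PyDict_GetItem(self->fast_memo, key)) {
                Py_DECREF(key);
                PyErr_SetString(PyExc_ValueError,
                    "fast mode: object contains a reference to itself");
                return 0;
            }
            if (PyDict_SetItem(self->fast_memo, key, obj) < 0) {
                Py_DECREF(key);
                return 0;
            }
            Py_DECREF(key);
        }
        return 1;
    }

    static void
    fast_save_leave(Picklerobject *self, PyObject *obj)
    {
        if (self->fast_container-- > FAST_LIMIT) {
            PyObject *key = PyLong_FromVoidPtr(obj);
            if (key != NULL) {
                PyDict_DelItem(self->fast_memo, key);
                Py_DECREF(key);
            }
        }
    }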

Remove test code. It's available in Lib/test/picklertester.py.

initialize (or use or even know about :-).

To whoever changed a bunch of (PyCFunction) casts to
(PyNoArgsFunction) in PyMethodDef initializers: don't do that. The
cast is there to shut the compiler up. The compiler wants the function
pointer initializer to be a PyCFunction.
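
That is, keep the initializers in this shape (an illustrative fragment
with invented names):

    static PyObject *
    module_ping(PyObject *self)              /* takes no arguments */
    {
        Py_INCREF(Py_None);
        return Py_None;
    }

    static PyMethodDef module_methods[] = {
        /* The cast to PyCFunction is exactly what satisfies the slot's
           declared type; casting to PyNoArgsFunction defeats it. */
        {"ping", (PyCFunction)module_ping, METH_NOARGS, NULL},
        {NULL, NULL, 0, NULL}
    };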

Py_TPFLAGS_DYNAMICTYPE bit. There is no longer a performance benefit,
and I don't really see the use case any more.

extern-able name.

on SF bug #467145.