| Commit message | Author | Age | Files | Lines |
| |
The common technique for printing out a pointer has been to cast to a long
and use the "%lx" printf modifier. This is incorrect on Win64 where casting
to a long truncates the pointer. The "%p" formatter should be used instead.
The problem as stated by Tim:
> Unfortunately, the C committee refused to define what %p conversion "looks
> like" -- they explicitly allowed it to be implementation-defined. Older
> versions of Microsoft C even stuck a colon in the middle of the address (in
> the days of segment+offset addressing)!
The result is that the hex value of a pointer may or may not have a 0x
prepended to it.
Notes on the patch:
There are two main classes of changes:
- in the various repr() functions that print out pointers
- debugging printf's in the various thread_*.h files (these are why the
patch is large)
Closes SourceForge patch #100505.
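Illustration (not from the patch itself) of the two formatting approaches,
assuming a Win64-style LLP64 model where long is 32 bits and pointers are
64 bits:

    #include <stdio.h>

    static void
    show_pointer(void *p)
    {
        /* Wrong on LLP64 (Win64): the cast to long drops the upper 32 bits. */
        printf("truncated: %lx\n", (unsigned long)p);
        /* Right: %p prints the whole pointer; whether a "0x" prefix appears
           is implementation-defined. */
        printf("via %%p:   %p\n", p);
    }

    int
    main(void)
    {
        int x = 0;
        show_pointer(&x);
        return 0;
    }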
| |
This patch fixes possible overflow in the use of
PyOS_GetLastModificationTime in getmtime.c and Python/import.c.
Currently PyOS_GetLastModificationTime returns a C long. This can
overflow on Win64, where sizeof(time_t) > sizeof(long). Besides, it
should logically return a time_t anyway (this patch changes that).
As well, import.c uses PyOS_GetLastModificationTime for .pyc
timestamping. There has been recent discussion about the .pyc header
format on python-dev. This patch adds overflow checking to import.c so
that an exception will be raised if the modification time
overflows. There are a few other minor 64-bit readiness changes made
to the module as well:
- size_t instead of int or long for function-local buffer and string
length variables
- one buffer overflow check was added (it raises an exception on possible
overflow; this overflow risk exists on 32-bit platforms as well); no
other possible buffer overflows were found (from my analysis, anyway)
Closes SourceForge patch #100509.
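A hedged sketch of the overflow check described above (the helper name is
hypothetical, and it assumes the patched PyOS_GetLastModificationTime() now
returns a time_t):

    #include "Python.h"

    static long
    checked_mtime(char *pathname, FILE *fp)
    {
        time_t mtime = PyOS_GetLastModificationTime(pathname, fp);
        /* On Win64 sizeof(time_t) > sizeof(long): refuse to write a value
           that would be silently truncated into the 4-byte .pyc header slot. */
        if ((time_t)(long)mtime != mtime) {
            PyErr_SetString(PyExc_OverflowError,
                            "modification time overflows a 4 byte field");
            return -1;
        }
        return (long)mtime;
    }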
| |
The common technique for printing out a pointer has been to cast to a long
and use the "%lx" printf modifier. This is incorrect on Win64 where casting
to a long truncates the pointer. The "%p" formatter should be used instead.
The problem as stated by Tim:
> Unfortunately, the C committee refused to define what %p conversion "looks
> like" -- they explicitly allowed it to be implementation-defined. Older
> versions of Microsoft C even stuck a colon in the middle of the address (in
> the days of segment+offset addressing)!
The result is that the hex value of a pointer may or may not have a 0x
prepended to it.
Notes on the patch:
There are two main classes of changes:
- in the various repr() functions that print out pointers
- debugging printf's in the various thread_*.h files (these are why the
patch is large)
Closes SourceForge patch #100505.
| |
another typo caught by Rob Hooft
| |
| |
trent (who broke it in the first place ;-) will come up
with a cleaner solution.
| |
warning on Windows.
| |
This patch fixes a problem on AIX with the signed int case code in
getargs.c, after Trent Mick's intervention about MIN/MAX overflow
checks. The AIX compiler/optimizer generates bogus code with the
default flags "-g -O" causing test_builtin to fail: int("10", 16) <>
16L. Swapping the two checks in the signed int code makes the problem
go away.
Also, make the error messages fit in 80 char lines in the
source.
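A hedged sketch of the reordered check (the helper is illustrative, not the
actual getargs.c code); the point is simply that the INT_MAX test now comes
before the INT_MIN test:

    #include "Python.h"
    #include <limits.h>

    static int
    convert_int(PyObject *arg, int *result)
    {
        long ival = PyInt_AsLong(arg);
        if (ival == -1 && PyErr_Occurred())
            return 0;
        /* MAX is tested first; swapping the two tests is what stops the
           AIX "-g -O" optimizer from generating bogus code. */
        if (ival > INT_MAX) {
            PyErr_SetString(PyExc_OverflowError,
                            "signed integer is greater than maximum");
            return 0;
        }
        if (ival < INT_MIN) {
            PyErr_SetString(PyExc_OverflowError,
                            "signed integer is less than minimum");
            return 0;
        }
        *result = (int)ival;
        return 1;
    }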
| |
The depth field was never decremented inside w_object(), and it was
never initialized in PyMarshal_WriteObjectToFile().
This caused imports from .pyc files to fail mysteriously when the .pyc
file was written by the broken code -- w_object() would bail out
early, but PyMarshal_WriteObjectToFile() doesn't check the error or
return an error code, and apparently the marshalling code doesn't call
PyErr_Check() either. (That's a separate patch if I feel like it.)
| |
tests.
| |
Various small fixes to the builtin module to ensure no buffer
overflows.
- chunk #1:
Proper casting to ensure no truncation, and hence no surprises, in the
comparison.
- chunk #2:
The id() function guarantees a unique return value for different
objects. It does this by returning the pointer to the object. By
returning a PyInt, on Win64 (sizeof(long) < sizeof(void*)) the pointer
is truncated and the guarantee may be proven false. The appropriate
return function is PyLong_FromVoidPtr, which returns a PyLong only when
that is necessary to return the pointer without truncation (a sketch
follows at the end of this entry).
[GvR: note that this means that id() can now return a long on Win32
platforms. This *might* break some code...]
- chunk #3:
Ensure no overflow in raw_input(). Granted the user would have to pass
in >2GB of data but it *is* a possible buffer overflow condition.
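A hedged sketch of the id() change from chunk #2 (argument handling
simplified):

    static PyObject *
    builtin_id(PyObject *self, PyObject *v)
    {
        /* PyLong_FromVoidPtr() returns a plain int when the pointer fits in
           a long and a Python long otherwise, so the value is never
           truncated where sizeof(void*) > sizeof(long), e.g. on Win64. */
        return PyLong_FromVoidPtr(v);
    }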
| |
As I really do not have anything better to do at the moment, I have written
a patch to Python/marshal.c that prevents Python from dumping core when trying
to marshal a stack-bustingly deep (or recursive) data structure.
It just throws an exception; even slightly clever handling of recursive
data is what pickle is for...
[Fred Drake:] Moved magic constant 5000 to a #define.
This closes SourceForge patch #100645.
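A hedged sketch of the recursion guard (the struct and field names here are
stand-ins for marshal.c's writer state, and the error value is illustrative):

    #include "Python.h"

    typedef struct {
        int depth;
        int error;
        /* ... output buffer / FILE* fields elided ... */
    } WFILE;

    #define MAX_MARSHAL_STACK_DEPTH 5000   /* was a bare magic constant */

    static void
    w_object(PyObject *v, WFILE *p)
    {
        p->depth++;
        if (p->depth > MAX_MARSHAL_STACK_DEPTH)
            p->error = 1;   /* too deep: the caller turns this into an exception */
        else {
            /* ... emit the type code and contents, recursing via w_object() ... */
        }
        p->depth--;         /* keep the counter balanced on the way out */
    }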
| |
the number of children of a node exceeds the max possible value for
the short that is used to count them. The Python runtime converts
this parser error into the SyntaxError "expression too long."
| |
| |
number of references on all Python objects. This is only enabled when
Py_TRACE_REFS is defined (which includes default debug builds under
Windows).
Also removed a redundant cast from sys.getrefcount(), as discussed on the patches list.
| |
Fix memory leak in initializing __debug__.
| |
Changed the API names for setting the default encoding.
These are now in line with the other hooks API names
(no underscores).
| |
| |
| |
module and into _exceptions.c. This includes all the PyExc_* globals,
the bltin_exc table, init_class_exc(), fini_instances(),
finierrors().
Renamed _PyBuiltin_Init_1() to _PyBuiltin_Init() since the two-phase
initializations are not necessary any more.
Removed as obsolete _PyBuiltin_Init_2(), _PyBuiltin_Fini_1() and
_PyBuiltin_Fini_2().
| |
need two phase init or fini of the builtin module. Change the call of
_PyBuiltin_Init_1() to _PyBuiltin_Init(). Add a call to
init_exceptions().
Py_Finalize(): Don't call _PyBuiltin_Fini_1(). Instead call
fini_exceptions() but move this to before the thread state is
cleared.
| |
used to build the fallback string-based exception.
| |
Calling Sleep(0) for a spinlock can cause a priority inversion; comments
were added to explain what's going on.
| |
the notice yet).
| |
Limit the 'b' formatter of PyArg_ParseTuple to valid values of an unsigned
char, i.e. [0,UCHAR_MAX]. It is expected that this is the common usage of 'b'.
An OverflowError is raised if the parsed value is outside this range.
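A hedged usage sketch (the function name is hypothetical): with this change,
a value outside [0, UCHAR_MAX] makes PyArg_ParseTuple() fail with
OverflowError instead of silently wrapping.

    static PyObject *
    set_byte(PyObject *self, PyObject *args)
    {
        char b;                                /* 'b' stores into a C char */
        if (!PyArg_ParseTuple(args, "b", &b))  /* e.g. set_byte(256) now   */
            return NULL;                       /* fails with OverflowError */
        return PyInt_FromLong((long)(unsigned char)b);
    }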
| |
Added APIs to allow setting and querying the system's
current string encoding: sys.set_string_encoding()
and sys.get_string_encoding().
| |
Moved some docs to the include file.
Added a NULL check to _PyCodec_Lookup() to make it
core dump safe.
| |
Fixed docs according to the new behaviour (the Unicode
encoding is no longer fixed to UTF-8).
| |
Change static slice_index() to extern _PyEval_SliceIndex() (with
different return value interpretation: 0 for failure, 1 for success).
| |
Changes the 'b', 'h', and 'i' formatters in PyArg_ParseTuple to raise an
Overflow exception if they overflow (previously they just silently
overflowed).
Changes by Guido: always accept values [0..255] (in addition to
[CHAR_MIN..CHAR_MAX]) for 'b' format; changed some spaces into tabs in
other code.
| |
| |
referencing an undefined variable, so we better change it back.
| |
who wrote:
Here's the new version of thread_nt.h. More particularly, there is a
new version of the thread lock that uses a kernel object (e.g. a semaphore)
only in case of contention; otherwise it simply uses interlocked
functions, which are faster by an order of magnitude. It doesn't
make much difference without threads present, but as soon as the thread
machinery is initialised and (mostly) the interpreter global lock is on,
the difference becomes tremendous. I've included a small script which
initialises threads and launches pystone. With the original thread_nt.h,
Pystone results with initialised threads are twofold worse than without
threads. With the new version, only 10% worse. I have used this
patch for about 6 months (with threaded and non-threaded
applications). It works remarkably well (though I'd desperately
prefer Python were free-threaded; I hope it will be soon).
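A minimal benaphore-style sketch of the idea (an illustration, not the actual
thread_nt.h code): interlocked operations cover the uncontended fast path,
and the kernel semaphore is only touched when another thread already holds
the lock.

    #include <windows.h>

    typedef struct {
        LONG owned;   /* -1 = free; 0 = held; > 0 = held, with waiters */
        HANDLE sem;   /* semaphore created with initial count 0 */
    } fastlock;

    static void
    fastlock_acquire(fastlock *lk)
    {
        /* Uncontended case: -1 -> 0, no kernel call at all. */
        if (InterlockedIncrement(&lk->owned) > 0)
            WaitForSingleObject(lk->sem, INFINITE);   /* contended: block */
    }

    static void
    fastlock_release(fastlock *lk)
    {
        /* If someone is waiting, hand the lock over through the semaphore. */
        if (InterlockedDecrement(&lk->owned) >= 0)
            ReleaseSemaphore(lk->sem, 1, NULL);
    }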
| |
| |
For more comments, read the patches@python.org archives.
For documentation read the comments in mymalloc.h and objimpl.h.
(This is not exactly what Vladimir posted to the patches list; I've
made a few changes, and Vladimir sent me a fix in private email for a
problem that only occurs in debug mode. I'm also holding back on his
change to main.c, which seems unnecessary to me.)
| |
- When 'import exceptions' fails, don't suggest using -v to print the traceback;
this doesn't actually work.
- Remove comment about fallback to string exceptions.
- Remove a PyErr_Occurred() check after all is said and done that can
never trigger.
- Remove static function newstdexception() which is no longer called.
| |
Added 'u' and 'u#' tags for PyArg_ParseTuple - these turn a
PyUnicodeObject argument into a Py_UNICODE * buffer, or a Py_UNICODE *
buffer plus a length with the '#'. Also added an analog to 'U'
for Py_BuildValue.
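A hedged usage sketch of the new format units (the function name is
hypothetical):

    static PyObject *
    echo_unicode(PyObject *self, PyObject *args)
    {
        Py_UNICODE *buf;
        int len;
        if (!PyArg_ParseTuple(args, "u#", &buf, &len))  /* buffer plus length */
            return NULL;
        return Py_BuildValue("u#", buf, len);           /* rebuild a Unicode object */
    }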
| |
return 0 (exceptions don't match). This means that if an ImportError
is raised because exceptions.py can't be imported, the interpreter
will exit "cleanly" with an error message instead of just core
dumping.
PyErr_SetFromErrnoWithFilename(), PyErr_SetFromWindowsErrWithFilename():
Don't test on Py_UseClassExceptionsFlag.
| |
| |
are no longer supported (i.e. -X option is removed).
_PyBuiltin_Init_1(): Don't call initerrors(). This does mean that it
is possible to raise an ImportError before that exception has been
initialized, say because exceptions.py can't be found, or contains
bogosity. See changes to errors.c for how this is handled.
_PyBuiltin_Init_2(): Don't test Py_UseClassExceptionsFlag, just go
ahead and initialize the class-based standard exceptions. If this
fails, we throw a Py_FatalError.
| |
API consistency, but nothing sets it or checks it now.
| |
| |
Changed all references to the MAGIC constant to use a global
pyc_magic instead. This global is initially set to MAGIC, but can be
changed by the _PyImport_Init() function to provide for
special features implemented in the compiler that are settable
using command-line switches and affect the way .pyc files are
generated.
Currently this change is only done for the -U flag.
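A hedged sketch of the mechanism (the exact adjustment applied for -U is
illustrative, not taken from the patch):

    static long pyc_magic = MAGIC;   /* previously MAGIC was used directly */

    void
    _PyImport_Init(void)
    {
        /* -U changes how string literals are compiled, so .pyc files written
           in that mode get a different magic number and are never confused
           with normal ones. */
        if (Py_UnicodeFlag)
            pyc_magic = MAGIC + 1;
        /* ... remaining import initialization ... */
    }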
| |
Added Py_UnicodeFlag for use by the -U command line option.
| |
Support for the new -U command line option:
with the option enabled the Python compiler
interprets all "..." strings as u"..." (same with r"..." and
ur"...").
| |
1.6a2 caused by wrong return values in routine allcaps83. [GvR: I
also changed the case for end-s>8 to return 0.]
| |
because we've added Unicode marshalling to the repertoire.
| |
Follow a suggestion in an /*XXX*/ comment [in com_add()] to speed up
compilation by using supplemental dictionaries to keep track of names
and constants, eliminating quadratic behavior. With this patch in
place, the time to import a 5000-line file with lots of constants [at
the global level] is reduced from 20 seconds to under 3 on my system.
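A hedged sketch of the approach (simplified; the real patch keys the
dictionary somewhat more carefully so that, for example, 1 and 1.0 remain
distinct constants):

    static int
    com_add(PyObject *list, PyObject *dict, PyObject *v)
    {
        PyObject *index;
        int n;

        index = PyDict_GetItem(dict, v);       /* borrowed; NULL if absent */
        if (index != NULL)
            return (int)PyInt_AsLong(index);   /* seen before: O(1) lookup */

        /* First occurrence: append it and remember its position. */
        n = PyList_GET_SIZE(list);
        if (PyList_Append(list, v) != 0)
            return -1;
        index = PyInt_FromLong((long)n);
        if (index == NULL || PyDict_SetItem(dict, v, index) != 0) {
            Py_XDECREF(index);
            return -1;
        }
        Py_DECREF(index);
        return n;
    }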
| |
Here's a patch which changes modsupport to add 'u' and 'u#',
to support building Unicode objects from a null-terminated
Py_UNICODE *, and a Py_UNICODE * with length, respectively.
[Conversion from 'U' to 'u' by Fred, based on python-dev comments.]
Note that the use of None for NULL Py_UNICODE* values is
still in; I'm not sure of the conclusion on that issue.
| |
remaining object references if the environment variable PYTHONDUMPREFS
exists. The default behaviour caused problems for background or
otherwise invisible processes that use the debug build of Python.