Commit messages
|
are moving into the core; with these changes, it will be possible for the
exception to be raised without the weakref module ever being imported.
|
|
|
a hard-coded type check.
|
The new profiler event stream includes a "return" event even when an
exception is being propagated, but the machinery that called the profile
hook did not save & restore the exception. In debug mode, the exception
was detected during the execution of the profile callback, which did not
have the proper internal flags set for the exception. Saving & restoring
the exception state solves the problem.
|
|
|
on IRIX 6.5).
|
The profiler does not need to know anything about the exception state,
so we no longer call it when an exception is raised. We do, however,
make sure we *always* call the profiler when we exit a frame. This
ensures that timing events are more easily isolated by a profiler and
finally clauses that do a lot of work don't have their time
mis-allocated.
When an exception is propagated out of the frame, the C callback for
the profiler now receives a PyTrace_RETURN event with an arg of NULL;
the Python-level profile hook function will see a 'return' event with
an arg of None. This means that from Python it is impossible for the
profiler to determine if the frame exited with an exception or if it
returned None, but this doesn't matter for profiling. A C-based
profiler could tell the difference, but this doesn't seem important.
ceval.c:eval_frame(): Simplify the code in two places so that the
    profiler is called for every exit from a frame and not for
    exceptions.
sysmodule.c:profile_trampoline(): Make sure we don't expose Python
    code to NULL; use None instead.
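
A minimal sketch of the Python-level view described above, written
against today's sys.setprofile(); the hook and function names are
illustrative only:

    import sys

    def profile_hook(frame, event, arg):
        # For a 'return' event, arg is the value being returned; when the
        # frame is exited by a propagating exception, arg is None, which
        # from Python is indistinguishable from an explicit "return None".
        if event == 'return':
            print(frame.f_code.co_name, 'exited, arg =', arg)

    def boom():
        raise ValueError('oops')

    sys.setprofile(profile_hook)
    try:
        boom()
    except ValueError:
        pass
    finally:
        sys.setprofile(None)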
|
Unknown whether this fixes it.
- stringobject.c, PyString_FromFormatV: don't assume that va_list is of
a type that can be copied via an initializer.
- errors.c, PyErr_Format: add a va_end() to balance the va_start().
|
Replaced 3 instances of "iter() of non-sequence" with
"iteration over non-sequence".
Restored "unpack non-sequence" for stuff like "a, b = 1".
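
Roughly how those messages surface from the interpreter today (the
exact wording has changed again since, so the comments below are
indicative only):

    try:
        iter(1)
    except TypeError as exc:
        print(exc)   # e.g. "'int' object is not iterable"

    try:
        a, b = 1
    except TypeError as exc:
        print(exc)   # e.g. "cannot unpack non-iterable int object"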
|
If a new exception occurs while an exception instance is being
created, try harder to make sure there is a traceback. If the
original exception had a traceback associated with it and the new
exception does not, keep the old exception.
Of course, callers to PyErr_NormalizeException() must still be
prepared to have tb set to NULL.
XXX This isn't an ideal solution, but it's better than no traceback at
all. This happens if, for example, the exception occurs because the
call to the constructor fails before any Python code is executed.
Guido suggests that if there is Python code that was about to be
executed -- but wasn't, say, because it was called with the wrong
number of arguments -- then we should point at the first line of the
code object anyway.
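
A minimal sketch of the situation being guarded against, i.e. a second
exception raised while the exception instance is being constructed; the
class name is invented, and current Python instantiates the exception
eagerly at the raise site rather than during normalization:

    import traceback

    class Broken(Exception):
        def __init__(self, *args):
            # The "new" exception that occurs while the original
            # exception instance is being created.
            raise RuntimeError('constructor failed')

    try:
        raise Broken('original problem')
    except Exception:
        # The goal described above: keep a usable traceback even here.
        traceback.print_exc()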
|
It's possible for PyErr_NormalizeException() to set the traceback
pointer to NULL. I'm not sure how to provoke this directly from
Python, although it may be possible. The error occurs when an
exception is set using PyErr_SetObject() and another exception occurs
while PyErr_NormalizeException() is creating the exception instance.
XXX As a result of this change, it's possible for an exception to
occur but sys.last_traceback to be left undefined. Not sure if this
is a problem.
|
popped frame-block. What an embarrassing bug! Especially for Jeremy, since
he accepted the patch :-)
This fixes SF bugs #463359 and #462937, and possibly other, *very* obscure
bugs with very deeply nested loops that continue the loop and then break out
of it or raise an exception.
|
compatibility, this required all places where an array of "struct
memberlist" structures was declared that is referenced from a type's
tp_members slot to change the type of the structure to PyMemberDef;
"struct memberlist" is now only used by old code that still calls
PyMember_Get/Set. The code in PyObject_GenericGetAttr/SetAttr now
calls the new APIs PyMember_GetOne/SetOne, which take a PyMemberDef
argument.
As examples, I added actual docstrings to the attributes of a few
types: file, complex, instance method, super, and xxsubtype.spamlist.
Also converted the symtable to new style getattr.
|
Renamed the 'readonly' field to 'flags' and defined some new flag
bits: READ_RESTRICTED and WRITE_RESTRICTED, as well as a shortcut
RESTRICTED that means both.
|
|
|
|
The chief effects are to make dir() do something useful and supply
them with an __class__.
|
|
|
|
builtin function); Guido pointed out that it could be just another
name in the __builtin__ dict for the file constructor now.
|
and trace functions; this now declares that None will be passed for the
"call" event.
This closes SF bug/suggestion #460315.
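
A minimal sketch of the documented convention, using sys.settrace();
the tracer below is illustrative:

    import sys

    def tracer(frame, event, arg):
        # As the documentation now states: for the 'call' event, arg is None.
        if event == 'call':
            assert arg is None
            print('entering', frame.f_code.co_name)
        return tracer

    def add(a, b):
        return a + b

    sys.settrace(tracer)
    add(1, 2)
    sys.settrace(None)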
|
|
|
Preliminary support. What's here works, but needs fine-tuning.
|
backwards compatibility. When using the class of the first base as
the metaclass, use its __class__ attribute in preference to its
ob_type slot. This ensures that we can still use classic classes as
metaclasses, as shown in the original "Metaclasses" essay. This also
makes all the examples in Demo/metaclasses/ work again (maybe these
should be turned into a test suite?).
|
parameter for the return string (as unix pathnames are not limited
by the 255 char pstring limit).
Implemented the function for MachO-Python, where it returns unix pathnames.
|
by bbrox@bbrox.org / lionel.ulmer@free.fr.
This adds a configure check and if all goes well turns on the
PTHREAD_SCOPE_SYSTEM thread attribute for new threads.
This should remove the need to add tiny sleeps at the start of threads
to allow other threads to be scheduled.
|
Reported by Fredrik Lundh on python-dev.
The convertsimple() code that handles Unicode arguments and converts
them to the default encoding now calls converterr() with the original
Unicode argument instead of the NULL returned by the failed encoding
attempt.
|
com_factor(): when a unary minus is attached to a float or imaginary zero,
don't optimize the UNARY_MINUS opcode away: the const dict can't
distinguish between +0.0 and -0.0, so it ended up treating both like the
first one added to it. Optimizing UNARY_PLUS away isn't a problem.
(BTW, I already uploaded the 2.2a3 Windows installer, and this isn't
important enough to delay the release.)
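
A minimal sketch of why the negation has to survive rather than being
folded into the constant table:

    import math

    print(0.0 == -0.0)             # True: equality alone can't tell them apart
    print(math.copysign(1, -0.0))  # -1.0: but the sign bit is real
    print(repr(-0.0), repr(0.0))   # -0.0 0.0: so the compiler must keep the sign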
|
found it necessary to warn about.
|
|
|
__builtin__.dir(). Moved the guts from bltinmodule.c to object.c.
|
|
|
|
way it's called each time a generator is resumed. The tracing of normal
functions should be unaffected by this change.
|
of PyMapping_Keys because we know we have a real dict. Tolerate that
objects may have an attr named "__dict__" that's not a dict (Py_None
popped up during testing).
test_descr.py, test_dir(): Test the new classic-class behavior; beef up
    the new-style class test similarly.
test_pyclbr.py, checkModule(): dir(C) is no longer a synonym for
    C.__dict__.keys() when C is a classic class (looks like the same
    thing that burned distutils! -- should it be *made* a synonym again?
    Then it would be inconsistent with new-style class behavior.).
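
A minimal sketch of the distinction (shown with today's classes, where
everything behaves like the new-style case):

    class Base:
        def inherited(self):
            pass

    class C(Base):
        def own(self):
            pass

    print('inherited' in C.__dict__)  # False: __dict__ holds only C's own names
    print('inherited' in dir(C))      # True: dir() walks the bases as well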
|
bag. It's clearly wrong for classic classes, at heart because a classic
class doesn't have a __class__ attribute, and I'm unclear on whether
that's a feature or a bug. I'll repair this once I find out (in the
meantime, dir() applied to classic classes won't find the base classes,
while dir() applied to a classic-class instance *will* find the base
classes but not *their* base classes).
Please give the new dir() a try and see whether you love it or hate it.
The new dir([]) behavior is something I could come to love. Here's
something to hate:

>>> class C:
...     pass
...
>>> c = C()
>>> dir(c)
['__doc__', '__module__']
>>>

The idea that an instance has a __doc__ attribute is jarring (of course
it's really c.__class__.__doc__ == C.__doc__; likewise for __module__).
OTOH, the code already has too many special cases, and dir(x) doesn't
have a compelling or clear purpose when x isn't a module.
|
|
|
Moved the declarations to pymactoolbox.h.
|
PEP 238. Changes:
- add a new flag variable Py_DivisionWarningFlag, declared in
  pydebug.h, defined in object.c, set in main.c, and used in
  {int,long,float,complex}object.c. When this flag is set, the
  classic division operator issues a DeprecationWarning message.
- add a new API PyRun_SimpleStringFlags() to match
  PyRun_SimpleString(). The main() function calls this so that
  commands run with -c can also benefit from -Dnew.
- While I was at it, I changed the usage message in main() somewhat:
  alphabetized the options, split it in *four* parts to fit in under
  512 bytes (not that I still believe this is necessary -- doc strings
  elsewhere are much longer), and perhaps most visibly, don't display
  the full list of options on each command line error. Instead, the
  full list is only displayed when -h is used, and otherwise a brief
  reminder of -h is displayed. When -h is used, write to stdout so
  that you can do `python -h | more'.
Notes:
- I don't want to use the -W option to control whether the classic
  division warning is issued or not, because the machinery to decide
  whether to display the warning or not is very expensive (it involves
  calling into the warnings.py module). You can use -Werror to turn
  the warnings into exceptions though.
- The -Dnew option doesn't select future division for all of the
  program -- only for the __main__ module. I don't know if I'll ever
  change this -- it would require changes to the .pyc file magic
  number to do it right, and a more global notion of compiler flags.
- You can usefully combine -Dwarn and -Dnew: this gives the __main__
  module new division, and warns about classic division everywhere
  else.
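
A minimal sketch of the semantic change being phased in, using the
`from __future__ import division` statement from PEP 238 (under
Python 3 this is simply the default):

    from __future__ import division

    print(1 / 2)   # 0.5: "new" (true) division
    print(1 // 2)  # 0: floor division keeps the old whole-number result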
|
|
|
|
affect nodes without another operator. This was causing negated
exponentiations to drop the exponentiation. This closes SF bug #456756.
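
A small reminder of the precedence involved; the fix keeps the
exponentiation when the sign is folded:

    # Unary minus binds more loosely than **, so -2 ** 2 is -(2 ** 2):
    print(-2 ** 2)    # -4
    print((-2) ** 2)  # 4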
|
|
|
|
PyInt_Check() succeeds. That returns true for subtypes of int, which
may override __add__ or __sub__.
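
A minimal sketch of why a plain PyInt_Check()-style test is not enough
as a fast-path guard; the class name below is invented:

    class NoisyInt(int):
        def __add__(self, other):
            return 'overridden add'

    # The value is an int by inheritance, but the fast path must not be
    # taken, because the subtype's __add__ has to win:
    print(NoisyInt(2) + 3)   # 'overridden add', not 5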
|
|
|
restricted execution mode.
|
when that test is doomed to deadlock.
|
pyport.h: typedef a new Py_intptr_t type.
DELICATE ASSUMPTION: That HAVE_UINTPTR_T implies intptr_t is
available as well as uintptr_t. If that turns out not to be
true, things must get uglier (C99 wants both, so I think it's
an assumption we're *likely* to get away with).
thread_nt.h, PyThread_start_new_thread: MS _beginthread is documented
as returning unsigned long; no idea why uintptr_t was being used.
Others: Always use Py_[u]intptr_t, never [u]intptr_t directly.
|
|
|
|
not enough for Python. Increased the stacksize to a (somewhat arbitrary)
64KB.
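
The commit raises the platform's default C stack size for new threads;
a rough Python-level analogue of that knob, with an arbitrary figure,
looks like this:

    import threading

    # Request a larger stack for threads created from now on
    # (not supported on every platform).
    threading.stack_size(512 * 1024)
    t = threading.Thread(target=lambda: None)
    t.start()
    t.join()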
|
|
|
ints, convert to PyLong (rather than throwing away the high-order 32 bits).
|
|
|
rather than a type equality test.
|
|
|
because of overflow, generate a long instead.
|
|
|
internal name (_AE). This can slow things down (once) but it's the only
way I can get things to work on OSX, OS9 dynamically loaded and OS9
frozen.
|
|
|
|
PyString_FromFormat() since it's much more generally useful than
just for exceptions.
|
This implements the 'getset' class from test_binop.py.
|
raise the exception here -- call the generic function (which may
convert the arguments to long and try again).
|
|
|
are overflowing and a long int operation is substituted.
|