| Commit message | Author | Age | Files | Lines |
WarningsRecorder object. This makes the API simpler to use as no special object
must be learned.
Closes issue 3781.
Review by Benjamin Peterson.
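A minimal sketch of the list-based API this change describes, using the
standard warnings module as it behaves today: in recording mode,
catch_warnings() hands back a plain list that collects the warnings raised
inside the block.

    import warnings

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        warnings.warn("something odd happened", UserWarning)

    # "caught" is an ordinary list of warning records, not a special object.
    assert len(caught) == 1
    assert issubclass(caught[0].category, UserWarning)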
on 32-bit systems on 64-bit systems, and vice versa. As a consequence
of the change, Random pickles created by Python 2.6 cannot be loaded
in Python 2.5.

test.test_support.catch_warning is more full-featured and provides the same
functionality.
Since guard_warnings_filter was added in 2.6, there are no
backwards-compatibility issues.

types.

Needs to be backported.

manager that protects warnings.filters from being modified once the context is
exited.
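A short illustration of the protection described above, assuming the context
manager behaves like today's warnings.catch_warnings(): filter changes made
inside the block do not leak out of it.

    import warnings

    before = list(warnings.filters)
    with warnings.catch_warnings():
        warnings.simplefilter("ignore")   # mutate the filter list inside the context
    assert warnings.filters == before     # the original filters are back after exit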
work, this time by ugly brute force.

Fix the hit and miss style of testing for sets and dicts.
Renamed the new generator at Trevor's recommendation.
The name HardwareRandom suggested a bit more than it
delivered (no radioactive decay detectors or such).

you leap" approach. Makes the early call to os.urandom() unnecessary.
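A hedged sketch of the try-it-and-recover style the message alludes to: ask
the OS entropy source directly and fall back to the clock only if it is
unavailable. The helper name initial_seed is hypothetical, not the module's
actual code.

    import os
    import time

    def initial_seed(nbytes=16):
        try:
            # Preferred path: seed material straight from the OS.
            return int.from_bytes(os.urandom(nbytes), "big")
        except NotImplementedError:
            # No OS-level randomness available; fall back to the clock.
            return int(time.time() * 256)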
* Complete the previous patch by making sure that the MachineRandom
tests are only run when the underlying resource is available.
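One way to express "only run when the resource is available" with unittest;
the test-class and helper names here are hypothetical, not the actual test
suite.

    import os
    import random
    import unittest

    def _urandom_available():
        try:
            os.urandom(1)
        except NotImplementedError:
            return False
        return True

    @unittest.skipUnless(_urandom_available(), "os.urandom() is not available")
    class MachineRandomTests(unittest.TestCase):
        def test_range(self):
            self.assertTrue(0.0 <= random.SystemRandom().random() < 1.0)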
* Use it for seeding when it is available.
* Provide an alternate generator based on it.
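In today's standard library the os.urandom()-backed generator is exposed as
random.SystemRandom; a brief usage sketch:

    import random

    sysrand = random.SystemRandom()   # draws its entropy from os.urandom()
    x = sysrand.random()              # float in [0.0, 1.0)
    n = sysrand.getrandbits(64)       # 64 bits straight from the OS source
    # seed() is ignored for this generator, and getstate()/setstate()
    # are not supported, since the OS owns the underlying state.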
* Install the unittests, docs, newsitem, include file, and makefile update.
* Exercise the new functions wherever sets.py was being used.
Includes the docs for libfuncs.tex. Separate docs for the types are
forthcoming.
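Assuming the new functions referred to are the built-in set() and frozenset()
constructors that took over from sets.py, a small illustration:

    s = set("abracadabra")       # mutable built-in set: {'a', 'b', 'c', 'd', 'r'}
    f = frozenset(["a", "b"])    # immutable, hashable counterpart
    assert f <= s                # subset tests work across the two types
    assert "a" in s and "z" not in s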
* Added a C-coded getrandbits(k) method that runs in linear time.
* Call the new method from randrange() for ranges >= 2**53.
* Adds a warning for generators not defining getrandbits() whenever they
have a call to randrange() with too large of a population.
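A quick sketch of the two entry points mentioned above, using the default
random.Random generator:

    import random

    rng = random.Random(12345)
    bits = rng.getrandbits(128)   # an int in the range [0, 2**128)
    big = rng.randrange(10**40)   # very large ranges lean on getrandbits() internally
    assert 0 <= bits < 2**128
    assert 0 <= big < 10**40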
random.sample() uses one of two algorithms depending on the ratio of the
sample size to the population size. One of the algorithms accepted any
iterable population argument so long as it defined __len__(). The other
had a stronger requirement that the population argument be indexable.
While it met the documentation specifications which insisted that the
population argument be a sequence, it made random.sample() less usable
with sets. So, the second algorithm was modified to coerce non-indexable
iterables and dictionaries into a tuple before proceeding.
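The coercion described above happened inside sample() itself; current Python
instead expects the caller to pass a sequence, so the equivalent explicit
form looks like this:

    import random

    population = {"red", "green", "blue", "yellow"}   # a set, not indexable
    picks = random.sample(tuple(population), 2)       # coerce explicitly, then sample
    assert len(picks) == 2 and set(picks) <= population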
The default seed is time.time(), multiplied by 256 before truncating so
that fractional seconds are used.
This way, two successive calls to random.seed() are much more likely
to produce different sequences.
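A small worked example of the scaling described above (illustrative only; the
constant mirrors the message):

    import time

    t = time.time()              # e.g. 1700000000.3071
    seed_plain = int(t)          # truncation throws away the fractional second
    seed_scaled = int(t * 256)   # keeps roughly 1/256-second resolution, so
                                 # back-to-back seedings usually differ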
* Implement __reduce__() to support pickling.
* Add a test case to prove a successful roundtrip through pickle.
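A roundtrip check in the spirit of the test mentioned above, using only the
public pickle and random APIs:

    import pickle
    import random

    rng = random.Random(42)
    rng.random()                               # advance the state a little
    clone = pickle.loads(pickle.dumps(rng))    # exercises the pickling support
    assert clone.random() == rng.random()      # both continue from the same state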
time.sleep(1) sometimes delays for fractionally less than a second,
resulting in too short an interval for the time.time() function
to create a distinct seed.
* Use Sets module to more clearly articulate a couple of tests.
core generator for random.py.

imports e.g. test_support must do so using an absolute package name
such as "import test.test_support" or "from test import test_support".
This also updates the README in Lib/test, and gets rid of the
duplicate data directory in Lib/test/data (replaced by
Lib/email/test/data).
Now Tim and Jack can have at it. :)
and the .seed() and .whseed() methods failed to reset it. In other
words, setting the seed didn't completely determine the sequence of
results produced by random.gauss(). It does now. Programs repeatedly
mixing calls to a seed method with calls to gauss() may see different
results now.
Bugfix candidate (random.gauss() has always been broken in this way),
even though it may change results.
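A quick check of the behaviour the fix guarantees: reseeding now clears the
hidden state mentioned above, so the gauss() sequence is fully determined by
the seed.

    import random

    rng = random.Random()
    rng.seed(1234)
    first = [rng.gauss(0.0, 1.0) for _ in range(3)]
    rng.seed(1234)
    second = [rng.gauss(0.0, 1.0) for _ in range(3)]
    assert first == second   # reseeding fully determines gauss() output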