author     Antoine Pitrou <solipsis@pitrou.net>  2011-01-06 16:31:28 (GMT)
committer  Antoine Pitrou <solipsis@pitrou.net>  2011-01-06 16:31:28 (GMT)
commit     003428158bc6cb2bcac766b4394d1519172c22cb (patch)
tree       3600fdfd6da17133e1f376bf074116d9b6ddf924
parent     f8dc9ca84e112203ddcf4f8499acd65b911bfa80 (diff)
Elaborate about the GIL.
-rw-r--r--  Doc/glossary.rst           36
-rw-r--r--  Doc/library/threading.rst  12
2 files changed, 35 insertions, 13 deletions
diff --git a/Doc/glossary.rst b/Doc/glossary.rst
index 32ad8e3..9ec61de 100644
--- a/Doc/glossary.rst
+++ b/Doc/glossary.rst
@@ -102,9 +102,10 @@ Glossary
See :pep:`343`.
CPython
- The canonical implementation of the Python programming language. The
- term "CPython" is used in contexts when necessary to distinguish this
- implementation from others such as Jython or IronPython.
+ The canonical implementation of the Python programming language, as
+ distributed on `python.org <http://python.org>`_. The term "CPython"
+ is used when necessary to distinguish this implementation from others
+ such as Jython or IronPython.
decorator
A function returning another function, usually applied as a function
@@ -263,16 +264,25 @@ Glossary
See :term:`global interpreter lock`.
global interpreter lock
- The lock used by Python threads to assure that only one thread
- executes in the :term:`CPython` :term:`virtual machine` at a time.
- This simplifies the CPython implementation by assuring that no two
- processes can access the same memory at the same time. Locking the
- entire interpreter makes it easier for the interpreter to be
- multi-threaded, at the expense of much of the parallelism afforded by
- multi-processor machines. Efforts have been made in the past to
- create a "free-threaded" interpreter (one which locks shared data at a
- much finer granularity), but so far none have been successful because
- performance suffered in the common single-processor case.
+ The mechanism used by the :term:`CPython` interpreter to assure that
+ only one thread executes Python :term:`bytecode` at a time.
+ This simplifies the CPython implementation by making the object model
+ (including critical built-in types such as :class:`dict`) implicitly
+ safe against concurrent access. Locking the entire interpreter
+ makes it easier for the interpreter to be multi-threaded, at the
+ expense of much of the parallelism afforded by multi-processor
+ machines.
+
+ However, some extension modules, either standard or third-party,
+ are designed so as to release the GIL when doing computationally-intensive
+ tasks such as compression or hashing. Also, the GIL is always released
+ when doing I/O.
+
+ Past efforts to create a "free-threaded" interpreter (one which locks
+ shared data at a much finer granularity) have not been successful
+ because performance suffered in the common single-processor case. It
+ is believed that overcoming this performance issue would make the
+ implementation much more complicated and therefore costlier to maintain.
hashable
An object is *hashable* if it has a hash value which never changes during
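
For illustration, a minimal sketch of the behaviour the reworded glossary entry
describes: pure-Python bytecode is serialized by the GIL, while operations that
release it, such as hashing a large buffer or blocking on I/O, can overlap
across threads. The buffer size, thread count and choice of :mod:`hashlib` are
arbitrary assumptions for this sketch; exactly which calls release the GIL is a
CPython implementation detail. ::

    import hashlib
    import threading
    import time

    DATA = b"x" * (64 * 1024 * 1024)   # 64 MiB of input to hash

    def hash_data():
        # hashlib digests large buffers in C and can release the GIL while
        # doing so, letting several of these calls proceed in parallel.
        hashlib.sha256(DATA).hexdigest()

    def count(n=10 ** 7):
        # A pure-Python loop executes bytecode and therefore holds the GIL.
        while n:
            n -= 1

    def timed(target, workers=4):
        threads = [threading.Thread(target=target) for _ in range(workers)]
        start = time.perf_counter()
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return time.perf_counter() - start

    if __name__ == "__main__":
        print("hashing  (GIL mostly released):", timed(hash_data))
        print("counting (GIL held):           ", timed(count))
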
diff --git a/Doc/library/threading.rst b/Doc/library/threading.rst
index 371ac90..74c9976 100644
--- a/Doc/library/threading.rst
+++ b/Doc/library/threading.rst
@@ -17,11 +17,23 @@ The :mod:`dummy_threading` module is provided for situations where
methods and functions in this module in the Python 2.x series are still
supported by this module.
+.. impl-detail::
+
+ Due to the :term:`Global Interpreter Lock`, in CPython only one thread
+ can execute Python code at once (even though certain performance-oriented
+ libraries might overcome this limitation).
+ If you want your application to make better use of the computational
+ resources of multi-core machines, you are advised to use
+ :mod:`multiprocessing` or :class:`concurrent.futures.ProcessPoolExecutor`.
+ However, threading is still an appropriate model if you want to run
+ multiple I/O-bound tasks simultaneously.
+
.. seealso::
Latest version of the :source:`threading module Python source code
<Lib/threading.py>`
+
This module defines the following functions and objects:
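
For illustration, a minimal sketch of the workflow the new :mod:`threading`
note recommends: :class:`concurrent.futures.ProcessPoolExecutor` for CPU-bound
work, threads for I/O-bound work. The URLs and workload sizes below are
placeholder assumptions for the sketch. ::

    import concurrent.futures
    import urllib.request

    URLS = [                      # placeholder URLs, for illustration only
        "http://python.org",
        "http://docs.python.org",
    ]

    def cpu_bound(n):
        # Pure-Python arithmetic is limited by the GIL, so spread it over
        # separate processes to use several cores.
        return sum(i * i for i in range(n))

    def io_bound(url):
        # The GIL is released while waiting on the network, so threads are
        # enough to overlap many downloads.
        with urllib.request.urlopen(url) as response:
            return url, len(response.read())

    if __name__ == "__main__":
        # CPU-bound: one process per task can run on a separate core.
        with concurrent.futures.ProcessPoolExecutor() as pool:
            print(list(pool.map(cpu_bound, [10 ** 6] * 4)))
        # I/O-bound: threads overlap the waiting despite the GIL.
        with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
            for url, size in pool.map(io_bound, URLS):
                print(url, size)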