author      Éric Araujo <merwok@netwok.org>   2011-06-19 17:23:48 (GMT)
committer   Éric Araujo <merwok@netwok.org>   2011-06-19 17:23:48 (GMT)
commit      e043b6bac7c2d2845e0b1d10d140364c73de8456 (patch)
tree        57f5972e3239ebd80d1c4ab9fe39f41514a26f7d
parent      348c572dcf1716f2cb2b5244b872ab9b186474da (diff)
Add missing documentation for packaging.pypi.base and .simple
-rw-r--r--   Doc/library/packaging.pypi.rst          21
-rw-r--r--   Doc/library/packaging.pypi.simple.rst   87
2 files changed, 95 insertions, 13 deletions
diff --git a/Doc/library/packaging.pypi.rst b/Doc/library/packaging.pypi.rst
index 93b61c9..14602ce 100644
--- a/Doc/library/packaging.pypi.rst
+++ b/Doc/library/packaging.pypi.rst
@@ -51,3 +51,24 @@ with a preference toward the latter.
.. method:: get_release
.. method:: get_releases
+
+
+:mod:`packaging.pypi.base` --- Base class for index crawlers
+============================================================
+
+.. module:: packaging.pypi.base
+ :synopsis: Base class used to implement crawlers.
+
+
+.. class:: BaseClient(prefer_final, prefer_source)
+
+ Base class containing common methods for the index crawlers or clients. One
+ method is currently defined:
+
+ .. method:: download_distribution(requirements, temp_path=None, \
+ prefer_source=None, prefer_final=None)
+
+ Download a distribution from the latest release that satisfies the
+ requirements. If *temp_path* is provided, download to this path;
+ otherwise, create a temporary directory for the download. If a release is
+ found, the full path to the downloaded file is returned.
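As an illustration of the signature documented above, here is a minimal usage sketch. It assumes the provisional ``packaging`` package (distutils2) is importable, uses the :class:`Crawler` from :mod:`packaging.pypi.simple` (which inherits :meth:`download_distribution` from :class:`BaseClient`), and the project name ``FooBar`` and its requirement string are placeholders, not real data::

    # Sketch only: ``packaging`` (distutils2) may not be installed, and the
    # project name 'FooBar' is a placeholder.
    from packaging.pypi.simple import Crawler  # a subclass of BaseClient

    client = Crawler(prefer_final=True, prefer_source=True)

    # No temp_path is given, so a temporary directory is created for the
    # download; the full path to the downloaded file is returned.
    path = client.download_distribution('FooBar (>= 1.0)')
    print(path)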
diff --git a/Doc/library/packaging.pypi.simple.rst b/Doc/library/packaging.pypi.simple.rst
index ea5edca..92b3270 100644
--- a/Doc/library/packaging.pypi.simple.rst
+++ b/Doc/library/packaging.pypi.simple.rst
@@ -6,26 +6,87 @@
and distributions.
-`packaging.pypi.simple` can process Python Package Indexes and provides
-useful information about distributions. It also can crawl local indexes, for
-instance.
+The class provided by :mod:`packaging.pypi.simple` can access project indexes
+and provide useful information about distributions. PyPI, other indexes and
+local indexes are supported.
-You should use `packaging.pypi.simple` for:
+You should use this module to search distributions by name and version, to
+process the external pages of an index, and to download distributions. It is
+not suited for operations that require processing large parts of the index
+(like "finding all distributions with a specific version, no matter the
+name"); use :mod:`packaging.pypi.xmlrpc` for that.
- * Search distributions by name and versions.
- * Process index external pages.
- * Download distributions by name and versions.
-And should not be used for:
+API
+---
- * Things that will end up in too long index processing (like "finding all
- distributions with a specific version, no matters the name")
+.. class:: Crawler(index_url=DEFAULT_SIMPLE_INDEX_URL, \
+ prefer_final=False, prefer_source=True, \
+ hosts=('*',), follow_externals=False, \
+ mirrors_url=None, mirrors=None, timeout=15, \
+ mirrors_max_tries=0, verbose=False)
+
+ *index_url* is the address of the index to use for requests.
+
+ The next two parameters control the query results. *prefer_final*
+ indicates whether a final version (not alpha, beta or candidate) is to be
+ preferred over a newer but non-final version (for example, whether to pick
+ 1.0 over 2.0a3). It is used only for queries that don't give a version
+ argument. Likewise, *prefer_source* tells whether to prefer a source
+ distribution over a binary one if no distribution argument was provided.
+
+ Other parameters are related to external links, that is, links that go
+ outside the simple index: *hosts* is a list of hosts allowed to be
+ processed when *follow_externals* is true (the default ``('*',)`` allows
+ all hosts), and *follow_externals* enables or disables following external
+ links (disabled by default).
+
+ The remaining parameters are related to the mirroring infrastructure
+ defined in :PEP:`381`. *mirrors_url* gives a URL to look on for DNS
+ records listing mirror addresses; *mirrors* is a list of mirror URLs (see
+ the PEP). If both are given, *mirrors_url* is used only when *mirrors* is
+ set to ``None``. *timeout* is the time (in seconds) to wait before
+ considering that a URL has timed out; *mirrors_max_tries* is the number of
+ times to request information from a mirror before switching to another.
+
+ The following methods are defined:
+
+ .. method:: get_distributions(project_name, version)
+
+ Return the distributions found in the index for the given release.
+
+ .. method:: get_metadata(project_name, version)
+
+ Return the metadata found on the index for this project name and
+ version. Currently downloads and unpacks a distribution to read the
+ PKG-INFO file.
+
+ .. method:: get_release(requirements, prefer_final=None)
+
+ Return one release that fulfills the given requirements.
+
+ .. method:: get_releases(requirements, prefer_final=None, force_update=False)
+
+ Search for releases and return a
+ :class:`~packaging.pypi.dist.ReleasesList` object containing the
+ results.
+
+ .. method:: search_projects(name=None)
+
+ Search the index for projects containing the given name and return a
+ list of matching names.
+
+ See also the base class :class:`packaging.pypi.base.BaseClient` for inherited
+ methods.
+
+
+.. data:: DEFAULT_SIMPLE_INDEX_URL
+
+ The address used by default by the crawler class. It is currently
+ ``'http://a.pypi.python.org/simple/'``, the main PyPI installation.
-API
----
-.. class:: Crawler
Usage Exemples
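To tie the pieces above together, here is a hedged sketch of how the :class:`Crawler` API could be used, based only on the signatures documented in this patch. It assumes the provisional ``packaging`` package is importable; the project name ``FooBar``, the version ``1.0`` and the requirement strings are placeholders::

    # Sketch based on the documented signatures; not tested against a real
    # index, and 'FooBar' is a made-up project name.
    from packaging.pypi.simple import Crawler, DEFAULT_SIMPLE_INDEX_URL

    crawler = Crawler(index_url=DEFAULT_SIMPLE_INDEX_URL,
                      prefer_final=True,       # e.g. pick 1.0 over 2.0a3
                      follow_externals=False)  # stay on the simple index

    # Project names matching a substring.
    names = crawler.search_projects('foo')

    # One release fulfilling a requirement, and a ReleasesList of matches.
    release = crawler.get_release('FooBar (>= 1.0)')
    releases = crawler.get_releases('FooBar (>= 1.0)')

    # Distributions and metadata for a given project name and version;
    # get_metadata() downloads and unpacks a distribution to read PKG-INFO.
    dists = crawler.get_distributions('FooBar', '1.0')
    metadata = crawler.get_metadata('FooBar', '1.0')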