Commit: fix typos

noviluni committed Dec 18, 2019
1 parent 916382e commit c0d84f0
Showing 10 changed files with 44 additions and 45 deletions.
2 changes: 1 addition & 1 deletion docs/contributing.rst
@@ -217,7 +217,7 @@ the tests with Python 3.6 use::
 
     tox -e py36
 
-You can also specify a comma-separated list of environmets, and use :ref:`tox’s
+You can also specify a comma-separated list of environments, and use :ref:`tox’s
 parallel mode <tox:parallel_mode>` to run the tests on multiple environments in
 parallel::
 
2 changes: 1 addition & 1 deletion docs/faq.rst
@@ -338,7 +338,7 @@ How to split an item into multiple items in an item pipeline?
 input item. :ref:`Create a spider middleware <custom-spider-middleware>`
 instead, and use its
 :meth:`~scrapy.spidermiddlewares.SpiderMiddleware.process_spider_output`
-method for this puspose. For example::
+method for this purpose. For example::
 
     from copy import deepcopy
 
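The corrected FAQ line points at ``process_spider_output`` as the place to split one item into several. A minimal stdlib-only sketch of that pattern — the middleware name is made up, and items are modeled as plain dicts so it runs without Scrapy installed — might look like:

```python
from copy import deepcopy


class MultiplyItemsMiddleware:
    """Hypothetical spider middleware: emit two copies of every item.

    Anything that is not a dict (e.g. a follow-up request) is passed
    through unchanged.
    """

    def process_spider_output(self, response, result, spider):
        for item_or_request in result:
            if isinstance(item_or_request, dict):
                # deepcopy so the emitted items can diverge independently
                yield deepcopy(item_or_request)
                yield deepcopy(item_or_request)
            else:
                yield item_or_request
```

The key design point the FAQ makes is that pipelines must return exactly one item per input, while this spider-middleware hook is a generator and may yield any number of objects.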
69 changes: 34 additions & 35 deletions docs/news.rst

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/topics/jobs.rst
@@ -74,7 +74,7 @@ Request serialization
 For persistence to work, :class:`~scrapy.http.Request` objects must be
 serializable with :mod:`pickle`, except for the ``callback`` and ``errback``
 values passed to their ``__init__`` method, which must be methods of the
-runnning :class:`~scrapy.spiders.Spider` class.
+running :class:`~scrapy.spiders.Spider` class.
 
 If you wish to log the requests that couldn't be serialized, you can set the
 :setting:`SCHEDULER_DEBUG` setting to ``True`` in the project's settings page.
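The jobs.rst passage explains why persisted callbacks must be methods of the running spider: a method can be stored by name and looked up again on resume, whereas an ad-hoc callable cannot survive :mod:`pickle`. A small stdlib-only demonstration (no Scrapy involved; the payload keys are illustrative):

```python
import pickle

# Plain data — a URL plus the *name* of a spider-method callback —
# round-trips through pickle without trouble:
payload = {"url": "https://example.com", "callback": "parse"}
assert pickle.loads(pickle.dumps(payload)) == payload

# An anonymous function, by contrast, cannot be pickled, which is why
# lambdas and inner functions are ruled out as persisted callbacks:
try:
    pickle.dumps(lambda response: None)
    picklable = True
except Exception:
    picklable = False
assert not picklable
```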
2 changes: 1 addition & 1 deletion docs/topics/leaks.rst
@@ -110,7 +110,7 @@ ties the response lifetime to the requests' one, and that would definitely
 cause memory leaks.
 
 Let's see how we can discover the cause (without knowing it
-a-priori, of course) by using the ``trackref`` tool.
+a priori, of course) by using the ``trackref`` tool.
 
 After the crawler is running for a few minutes and we notice its memory usage
 has grown a lot, we can enter its telnet console and check the live
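The leaks.rst passage relies on ``trackref`` counting live objects per class. A toy, stdlib-only version of that idea — this is a sketch of the mechanism, not Scrapy's actual ``scrapy.utils.trackref`` implementation — can be built with weak references, so tracked objects are never kept alive by the tracker itself:

```python
import gc
import weakref
from collections import defaultdict

# Weak sets mean tracking never prevents garbage collection.
live_refs = defaultdict(weakref.WeakSet)


class ObjectRef:
    """Toy base class: register every new instance by its class."""

    def __new__(cls, *args, **kwargs):
        obj = super().__new__(cls)
        live_refs[cls].add(obj)
        return obj


class Response(ObjectRef):
    pass


r1, r2 = Response(), Response()
assert len(live_refs[Response]) == 2

del r1
gc.collect()  # CPython frees immediately; collect() for other runtimes
assert len(live_refs[Response]) == 1
```

Inspecting ``live_refs`` after a few minutes of crawling is exactly the kind of check the telnet-console workflow described above performs.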
2 changes: 1 addition & 1 deletion docs/topics/media-pipeline.rst
@@ -558,7 +558,7 @@ See here the methods that you can override in your custom Images Pipeline:
 Custom Images pipeline example
 ==============================
 
-Here is a full example of the Images Pipeline whose methods are examplified
+Here is a full example of the Images Pipeline whose methods are exemplified
 above::
 
     import scrapy
2 changes: 1 addition & 1 deletion docs/topics/settings.rst
@@ -884,7 +884,7 @@ LOG_FORMAT
 
 Default: ``'%(asctime)s [%(name)s] %(levelname)s: %(message)s'``
 
-String for formatting log messsages. Refer to the `Python logging documentation`_ for the whole list of available
+String for formatting log messages. Refer to the `Python logging documentation`_ for the whole list of available
 placeholders.
 
 .. _Python logging documentation: https://docs.python.org/2/library/logging.html#logrecord-attributes
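The corrected line documents :setting:`LOG_FORMAT`, which Scrapy hands to the stdlib :mod:`logging` module, so its effect can be previewed with ``logging`` alone. The logger name below is illustrative:

```python
import logging

# The default Scrapy LOG_FORMAT, quoted in the hunk above:
DEFAULT_LOG_FORMAT = "%(asctime)s [%(name)s] %(levelname)s: %(message)s"

record = logging.LogRecord(
    name="scrapy.core.engine",  # illustrative logger name
    level=logging.INFO,
    pathname="example.py",
    lineno=1,
    msg="Spider opened",
    args=None,
    exc_info=None,
)
line = logging.Formatter(DEFAULT_LOG_FORMAT).format(record)
assert line.endswith("[scrapy.core.engine] INFO: Spider opened")
```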
2 changes: 1 addition & 1 deletion docs/topics/telnetconsole.rst
@@ -48,7 +48,7 @@ autogenerated Password can be seen on scrapy logs like the example below::
 
     2018-10-16 14:35:21 [scrapy.extensions.telnet] INFO: Telnet Password: 16f92501e8a59326
 
-Default Username and Password can be overriden by the settings
+Default Username and Password can be overridden by the settings
 :setting:`TELNETCONSOLE_USERNAME` and :setting:`TELNETCONSOLE_PASSWORD`.
 
 .. warning::
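The corrected sentence names the two settings involved; overriding them is a one-line-each addition to a project's ``settings.py``. The values below are placeholders, not defaults:

```python
# settings.py — placeholder credentials, not Scrapy defaults
TELNETCONSOLE_USERNAME = "scrapy_admin"
TELNETCONSOLE_PASSWORD = "use-a-strong-secret-here"
```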
4 changes: 2 additions & 2 deletions sep/sep-001.rst
@@ -254,8 +254,8 @@ ItemForm
 
     #!python
     class MySiteForm(ItemForm):
-        witdth = adaptor(ItemForm.witdh, default_unit='cm')
-        volume = adaptor(ItemForm.witdh, default_unit='lt')
+        width = adaptor(ItemForm.width, default_unit='cm')
+        volume = adaptor(ItemForm.width, default_unit='lt')
 
         ia['width'] = x.x('//p[@class="width"]')
         ia['volume'] = x.x('//p[@class="volume"]')
2 changes: 1 addition & 1 deletion sep/sep-019.rst
@@ -185,7 +185,7 @@ These ideas translate to the following changes on the ``SpiderManager`` class:
   will return a spider class, not an instance. It's basically a ``__get__``
   to ``self._spiders``.
 
-- All remaining functions should be deprecated or remove accordantly, since a
+- All remaining functions should be deprecated or remove accordingly, since a
   crawler reference is no longer needed.
 
 - New helper ``get_spider_manager_class_from_scrapycfg`` in
