Commit

Merge branch 'main' into feature/import-cache-2
noahbkim authored May 20, 2024
2 parents 717a514 + 034cf0c commit b401a6e
Showing 183 changed files with 2,561 additions and 1,304 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/build.yml
@@ -486,6 +486,7 @@ jobs:
config_hash: ${{ needs.check_source.outputs.config_hash }}
options: ./configure --config-cache --with-thread-sanitizer --with-pydebug
suppressions_path: Tools/tsan/supressions.txt
tsan_logs_artifact_name: tsan-logs-default

build_tsan_free_threading:
name: 'Thread sanitizer (free-threading)'
@@ -496,6 +497,7 @@ jobs:
config_hash: ${{ needs.check_source.outputs.config_hash }}
options: ./configure --config-cache --disable-gil --with-thread-sanitizer --with-pydebug
suppressions_path: Tools/tsan/suppressions_free_threading.txt
tsan_logs_artifact_name: tsan-logs-free-threading

# CIFuzz job based on https://google.github.io/oss-fuzz/getting-started/continuous-integration/
cifuzz:
6 changes: 6 additions & 0 deletions .github/workflows/jit.yml
@@ -5,11 +5,17 @@ on:
- '**jit**'
- 'Python/bytecodes.c'
- 'Python/optimizer*.c'
- '!Python/perf_jit_trampoline.c'
- '!**/*.md'
- '!**/*.ini'
push:
paths:
- '**jit**'
- 'Python/bytecodes.c'
- 'Python/optimizer*.c'
- '!Python/perf_jit_trampoline.c'
- '!**/*.md'
- '!**/*.ini'
workflow_dispatch:

permissions:
3 changes: 2 additions & 1 deletion .github/workflows/reusable-docs.yml
@@ -62,7 +62,8 @@ jobs:
python Doc/tools/check-warnings.py \
--annotate-diff '${{ env.branch_base }}' '${{ env.branch_pr }}' \
--fail-if-regression \
-            --fail-if-improved
+            --fail-if-improved \
+            --fail-if-new-news-nit
# This build doesn't use problem matchers or check annotations
build_doc_oldest_supported_sphinx:
16 changes: 15 additions & 1 deletion .github/workflows/reusable-tsan.yml
@@ -11,6 +11,10 @@ on:
description: 'A repo relative path to the suppressions file'
required: true
type: string
tsan_logs_artifact_name:
description: 'Name of the TSAN logs artifact. Must be unique for each job.'
required: true
type: string

jobs:
build_tsan_reusable:
@@ -41,7 +45,7 @@ jobs:
sudo sysctl -w vm.mmap_rnd_bits=28
- name: TSAN Option Setup
run: |
echo "TSAN_OPTIONS=suppressions=${GITHUB_WORKSPACE}/${{ inputs.suppressions_path }}" >> $GITHUB_ENV
echo "TSAN_OPTIONS=log_path=${GITHUB_WORKSPACE}/tsan_log suppressions=${GITHUB_WORKSPACE}/${{ inputs.suppressions_path }} handle_segv=0" >> $GITHUB_ENV
echo "CC=clang" >> $GITHUB_ENV
echo "CXX=clang++" >> $GITHUB_ENV
- name: Add ccache to PATH
@@ -60,3 +64,13 @@ jobs:
run: make pythoninfo
- name: Tests
run: ./python -m test --tsan -j4
- name: Display TSAN logs
if: always()
run: find ${GITHUB_WORKSPACE} -name 'tsan_log.*' | xargs head -n 1000
- name: Archive TSAN logs
if: always()
uses: actions/upload-artifact@v4
with:
name: ${{ inputs.tsan_logs_artifact_name }}
path: tsan_log.*
if-no-files-found: ignore
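
For illustration only, a rough Python equivalent of the "Display TSAN logs" step above (the workflow itself uses find piped to xargs head); the ``tsan_log.*`` pattern comes from the ``log_path`` option set in ``TSAN_OPTIONS``, and the function name here is just a placeholder::

    from pathlib import Path

    def show_tsan_logs(workspace: str, max_lines: int = 1000) -> None:
        """Print up to max_lines from every TSAN log found under workspace."""
        for log in sorted(Path(workspace).rglob('tsan_log.*')):
            print(f'==> {log} <==')
            with log.open(errors='replace') as f:
                for lineno, line in enumerate(f):
                    if lineno >= max_lines:
                        break
                    print(line, end='')

    if __name__ == '__main__':
        show_tsan_logs('.')   # the workflow runs against ${GITHUB_WORKSPACE}
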
6 changes: 6 additions & 0 deletions Doc/Makefile
@@ -32,6 +32,7 @@ help:
@echo " clean to remove build files"
@echo " venv to create a venv with necessary tools"
@echo " html to make standalone HTML files"
@echo " gettext to generate POT files"
@echo " htmlview to open the index page built by the html target in your browser"
@echo " htmllive to rebuild and reload HTML files in your browser"
@echo " htmlhelp to make HTML files and a HTML help project"
@@ -140,6 +141,11 @@ pydoc-topics: build
@echo "Building finished; now run this:" \
"cp build/pydoc-topics/topics.py ../Lib/pydoc_data/topics.py"

.PHONY: gettext
gettext: BUILDER = gettext
gettext: SPHINXOPTS += '-d build/doctrees-gettext'
gettext: build

.PHONY: htmlview
htmlview: html
$(PYTHON) -c "import os, webbrowser; webbrowser.open('file://' + os.path.realpath('build/html/index.html'))"
1 change: 1 addition & 0 deletions Doc/c-api/dict.rst
@@ -191,6 +191,7 @@ Dictionary Objects
to both *default_value* and *\*result* (if it's not ``NULL``).
These may refer to the same object: in that case you hold two separate
references to it.
.. versionadded:: 3.13
2 changes: 2 additions & 0 deletions Doc/conf.py
@@ -374,6 +374,8 @@
# Split the index
html_split_index = True

# Split pot files one per reST file
gettext_compact = False

# Options for LaTeX output
# ------------------------
3 changes: 2 additions & 1 deletion Doc/library/dataclasses.rst
@@ -615,7 +615,8 @@ methods will raise a :exc:`FrozenInstanceError` when invoked.

There is a tiny performance penalty when using ``frozen=True``:
:meth:`~object.__init__` cannot use simple assignment to initialize fields, and
-must use :meth:`!__setattr__`.
+must use :meth:`!object.__setattr__`.
+.. Make sure to not remove "object" from "object.__setattr__" in the above markup
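
A minimal sketch of the behaviour this paragraph describes (not part of the diff): a frozen dataclass rejects ordinary attribute assignment, while the generated ``__init__`` still initializes fields through ``object.__setattr__``::

    from dataclasses import FrozenInstanceError, dataclass

    @dataclass(frozen=True)
    class Point:
        x: int
        y: int

    p = Point(1, 2)            # __init__ assigns fields via object.__setattr__
    try:
        p.x = 99               # plain assignment is blocked on frozen instances
    except FrozenInstanceError as exc:
        print(exc)             # cannot assign to field 'x'
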
.. _dataclasses-inheritance:

10 changes: 7 additions & 3 deletions Doc/library/functions.rst
@@ -608,9 +608,13 @@ are always available. They are listed here in alphabetical order.
will be used for both the global and the local variables. If *globals* and
*locals* are given, they are used for the global and local variables,
respectively. If provided, *locals* can be any mapping object. Remember
-   that at the module level, globals and locals are the same dictionary. If exec
-   gets two separate objects as *globals* and *locals*, the code will be
-   executed as if it were embedded in a class definition.
+   that at the module level, globals and locals are the same dictionary.
+
+   .. note::
+
+      Most users should just pass a *globals* argument and never *locals*.
+      If exec gets two separate objects as *globals* and *locals*, the code
+      will be executed as if it were embedded in a class definition.
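
A small sketch of the behaviour the new note describes (illustrative only, not part of the patch)::

    from textwrap import dedent

    code = dedent("""
        x = 10
        def double():
            return x * 2
    """)

    ns = {}
    exec(code, ns)                 # one dict: module-like, double() can see x
    print(ns['double']())          # 20

    g, l = {}, {}
    exec(code, g, l)               # two dicts: behaves like a class body
    print('x' in l, 'x' in g)      # True False -- assignments went to the locals mapping
    try:
        l['double']()              # double's globals are g, which has no 'x'
    except NameError as exc:
        print(exc)                 # name 'x' is not defined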

If the *globals* dictionary does not contain a value for the key
``__builtins__``, a reference to the dictionary of the built-in module
8 changes: 6 additions & 2 deletions Doc/library/functools.rst
@@ -646,8 +646,9 @@ The :mod:`functools` module defines the following functions:
attributes of the wrapper function are updated with the corresponding attributes
from the original function. The default values for these arguments are the
module level constants ``WRAPPER_ASSIGNMENTS`` (which assigns to the wrapper
-   function's ``__module__``, ``__name__``, ``__qualname__``, ``__annotations__``
-   and ``__doc__``, the documentation string) and ``WRAPPER_UPDATES`` (which
+   function's ``__module__``, ``__name__``, ``__qualname__``, ``__annotations__``,
+   ``__type_params__``, and ``__doc__``, the documentation string)
+   and ``WRAPPER_UPDATES`` (which
updates the wrapper function's ``__dict__``, i.e. the instance dictionary).

To allow access to the original function for introspection and other purposes
@@ -677,6 +678,9 @@ The :mod:`functools` module defines the following functions:
function, even if that function defined a ``__wrapped__`` attribute.
(see :issue:`17482`)

.. versionchanged:: 3.12
The ``__type_params__`` attribute is now copied by default.
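
A quick sketch of what the new attribute copying means in practice (requires Python 3.12+ for the PEP 695 syntax; the ``logged`` decorator here is only an example)::

    import functools

    def logged(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            print(f"calling {func.__name__}")
            return func(*args, **kwargs)
        return wrapper

    @logged
    def first[T](items: list[T]) -> T:
        "Return the first element."
        return items[0]

    print(first.__name__)            # first
    print(first.__doc__)             # Return the first element.
    print(first.__type_params__)     # (T,)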


.. decorator:: wraps(wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES)

2 changes: 1 addition & 1 deletion Doc/library/importlib.metadata.rst
@@ -343,7 +343,7 @@ instance::
>>> dist.metadata['License'] # doctest: +SKIP
'MIT'

-For editable packages, an origin property may present :pep:`610`
+For editable packages, an ``origin`` property may present :pep:`610`
metadata::

>>> dist.origin.url
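
For context, a hedged sketch of reading that metadata programmatically; ``origin`` is only populated when :pep:`610` ``direct_url.json`` data exists (editable installs, for example), and ``'pip'`` here is just a stand-in for any installed project name::

    from importlib.metadata import distribution

    dist = distribution('pip')                # assumed: any installed distribution
    origin = getattr(dist, 'origin', None)    # None on older Pythons or without PEP 610 data
    if origin is not None:
        print(origin.url)                     # the direct URL recorded at install time
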
70 changes: 31 additions & 39 deletions Doc/library/itertools.rst
@@ -122,15 +122,15 @@ loops that truncate the stream.
# accumulate([1,2,3,4,5]) → 1 3 6 10 15
# accumulate([1,2,3,4,5], initial=100) → 100 101 103 106 110 115
# accumulate([1,2,3,4,5], operator.mul) → 1 2 6 24 120
-        it = iter(iterable)
+        iterator = iter(iterable)
total = initial
if initial is None:
try:
-                total = next(it)
+                total = next(iterator)
except StopIteration:
return
yield total
-        for element in it:
+        for element in iterator:
total = func(total, element)
yield total
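
The rename from ``it`` to ``iterator`` is purely cosmetic; a quick check of the documented behaviour (illustrative, not part of the patch)::

    from itertools import accumulate
    import operator

    print(list(accumulate([1, 2, 3, 4, 5])))                 # [1, 3, 6, 10, 15]
    print(list(accumulate([1, 2, 3, 4, 5], initial=100)))    # [100, 101, 103, 106, 110, 115]
    print(list(accumulate([1, 2, 3, 4, 5], operator.mul)))   # [1, 2, 6, 24, 120]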

@@ -218,9 +218,8 @@ loops that truncate the stream.

def chain(*iterables):
# chain('ABC', 'DEF') → A B C D E F
-        for it in iterables:
-            for element in it:
-                yield element
+        for iterable in iterables:
+            yield from iterable


.. classmethod:: chain.from_iterable(iterable)
@@ -230,9 +229,8 @@ loops that truncate the stream.

def from_iterable(iterables):
# chain.from_iterable(['ABC', 'DEF']) → A B C D E F
-        for it in iterables:
-            for element in it:
-                yield element
+        for iterable in iterables:
+            yield from iterable
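
Both rewrites keep the documented behaviour, with ``yield from`` delegating to each iterable in turn; a short check::

    from itertools import chain

    print(list(chain('ABC', 'DEF')))                  # ['A', 'B', 'C', 'D', 'E', 'F']
    print(list(chain.from_iterable(['ABC', 'DEF'])))  # ['A', 'B', 'C', 'D', 'E', 'F']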


.. function:: combinations(iterable, r)
@@ -380,7 +378,7 @@ loops that truncate the stream.
saved.append(element)
while saved:
for element in saved:
-                    yield element
+                yield element

Note, this member of the toolkit may require significant auxiliary storage
(depending on the length of the iterable).
@@ -615,10 +613,10 @@ loops that truncate the stream.
This function is roughly equivalent to the following code, except that the
actual implementation does not build up intermediate results in memory::

-    def product(*args, repeat=1):
+    def product(*iterables, repeat=1):
# product('ABCD', 'xy') → Ax Ay Bx By Cx Cy Dx Dy
# product(range(2), repeat=3) → 000 001 010 011 100 101 110 111
-        pools = [tuple(pool) for pool in args] * repeat
+        pools = [tuple(pool) for pool in iterables] * repeat
result = [[]]
for pool in pools:
result = [x+[y] for x in result for y in pool]
@@ -696,24 +694,23 @@ loops that truncate the stream.

Return *n* independent iterators from a single iterable.

-   The following Python code helps explain what *tee* does (although the actual
-   implementation is more complex and uses only a single underlying
-   :abbr:`FIFO (first-in, first-out)` queue)::
+   Roughly equivalent to::

def tee(iterable, n=2):
-        it = iter(iterable)
-        deques = [collections.deque() for i in range(n)]
-        def gen(mydeque):
+        iterator = iter(iterable)
+        shared_link = [None, None]
+        return tuple(_tee(iterator, shared_link) for _ in range(n))
+
+    def _tee(iterator, link):
+        try:
            while True:
-                if not mydeque:             # when the local deque is empty
-                    try:
-                        newval = next(it)   # fetch a new value and
-                    except StopIteration:
-                        return
-                    for d in deques:        # load it to all the deques
-                        d.append(newval)
-                yield mydeque.popleft()
-        return tuple(gen(d) for d in deques)
+                if link[1] is None:
+                    link[0] = next(iterator)
+                    link[1] = [None, None]
+                value, link = link
+                yield value
+        except StopIteration:
+            return
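
The real :func:`tee` behaves the same way regardless of which pure-Python sketch is shown; a small usage check::

    from itertools import tee

    a, b = tee(iter('ABC'))
    print(list(a))    # ['A', 'B', 'C']
    print(list(b))    # ['A', 'B', 'C'] -- each returned iterator replays the data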

Once a :func:`tee` has been created, the original *iterable* should not be
used anywhere else; otherwise, the *iterable* could get advanced without
@@ -735,17 +732,17 @@ loops that truncate the stream.
iterables are of uneven length, missing values are filled-in with *fillvalue*.
Iteration continues until the longest iterable is exhausted. Roughly equivalent to::

-    def zip_longest(*args, fillvalue=None):
+    def zip_longest(*iterables, fillvalue=None):
# zip_longest('ABCD', 'xy', fillvalue='-') → Ax By C- D-
-        iterators = [iter(it) for it in args]
+        iterators = [iter(it) for it in iterables]
num_active = len(iterators)
if not num_active:
return
while True:
values = []
-            for i, it in enumerate(iterators):
+            for i, iterator in enumerate(iterators):
try:
-                    value = next(it)
+                    value = next(iterator)
except StopIteration:
num_active -= 1
if not num_active:
@@ -800,6 +797,7 @@ and :term:`generators <generator>` which incur interpreter overhead.
.. testcode::

import collections
import contextlib
import functools
import math
import operator
@@ -942,32 +940,26 @@ and :term:`generators <generator>` which incur interpreter overhead.
# iter_index('AABCADEAF', 'A') → 0 1 4 7
seq_index = getattr(iterable, 'index', None)
if seq_index is None:
-        # Path for general iterables
iterator = islice(iterable, start, stop)
for i, element in enumerate(iterator, start):
if element is value or element == value:
yield i
else:
-        # Path for sequences with an index() method
stop = len(iterable) if stop is None else stop
i = start
-        try:
+        with contextlib.suppress(ValueError):
while True:
yield (i := seq_index(value, i, stop))
i += 1
-        except ValueError:
-            pass

def iter_except(func, exception, first=None):
"Convert a call-until-exception interface to an iterator interface."
# iter_except(d.popitem, KeyError) → non-blocking dictionary iterator
-    try:
+    with contextlib.suppress(exception):
if first is not None:
yield first()
while True:
yield func()
-    except exception:
-        pass
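
Substituting ``contextlib.suppress`` for the ``try``/``except``/``pass`` blocks does not change what the recipes yield; for example, with the updated ``iter_except``::

    import contextlib

    def iter_except(func, exception, first=None):
        with contextlib.suppress(exception):
            if first is not None:
                yield first()
            while True:
                yield func()

    d = dict(a=1, b=2)
    print(list(iter_except(d.popitem, KeyError)))   # [('b', 2), ('a', 1)]
    print(d)                                        # {}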


The following recipes have a more mathematical flavor:
2 changes: 1 addition & 1 deletion Doc/library/marshal.rst
@@ -10,7 +10,7 @@
This module contains functions that can read and write Python values in a binary
format. The format is specific to Python, but independent of machine
architecture issues (e.g., you can write a Python value to a file on a PC,
-transport the file to a Sun, and read it back there). Details of the format are
+transport the file to a Mac, and read it back there). Details of the format are
undocumented on purpose; it may change between Python versions (although it
rarely does). [#]_
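
For illustration (not part of the change), a round trip through the module shows the machine-independence in practice; the same bytes can be written on one platform and read back on another running the same Python version::

    import marshal

    data = {'version': 3, 'values': [1, 2.5, 'text', b'bytes', (True, None)]}
    blob = marshal.dumps(data)            # platform-independent binary encoding
    print(marshal.loads(blob) == data)    # True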
