Test #1219 (test.yml, triggered on: schedule)

Jobs:
  unit-test      1h 58m
  docker-build   57s

Annotations: 2 errors and 2 warnings
unit-test:
tests/unit_tests/test_matchmaker_algorithm_stable_marriage.py#L153
test_stable_marriage_produces_symmetric_matchings[build_fast]
hypothesis.errors.Flaky: Hypothesis test_stable_marriage_produces_symmetric_matchings(request=<FixtureRequest for <Function test_stable_marriage_produces_symmetric_matchings[build_fast]>>, caplog_context=make_caplog_context, build_func=build_fast, searches=[Search(['p0', 'p1'], -229.516143641621, has_newbie),
Search(['p0', 'p1'], -485.35645313420395),
Search(['p0', 'p1'], -2.249985),
Search(['p0', 'p1'], -751.4999544824218),
Search(['p0', 'p1'], -4.000005),
Search(['p0', 'p1'], -1159.044523880034),
Search(['p0', 'p1'], 288.92390172796877),
Search(['p0', 'p1'], 657.6691446701983),
Search(['p0', 'p1'], 612.5535741268842),
Search(['p0', 'p1'], -304.8386010970557),
Search(['p0', 'p1'], -85.69059528864626),
Search(['p0', 'p1'], 657.6691446701983),
Search(['p0', 'p1'], -4.349985)]) produces unreliable results: Falsified on the first call but did not on a subsequent one
Falsifying example: test_stable_marriage_produces_symmetric_matchings(
request=<FixtureRequest for <Function test_stable_marriage_produces_symmetric_matchings[build_fast]>>,
caplog_context=make_caplog_context,
build_func=build_fast,
searches=[Search(['p0', 'p1'], -229.516143641621, has_newbie),
Search(['p0', 'p1'], -485.35645313420395),
Search(['p0', 'p1'], -2.249985),
Search(['p0', 'p1'], -751.4999544824218),
Search(['p0', 'p1'], -4.000005),
Search(['p0', 'p1'], -1159.044523880034),
Search(['p0', 'p1'], 288.92390172796877),
Search(['p0', 'p1'], 657.6691446701983),
Search(['p0', 'p1'], 612.5535741268842),
Search(['p0', 'p1'], -304.8386010970557),
Search(['p0', 'p1'], -85.69059528864626),
Search(['p0', 'p1'], 657.6691446701983),
Search(['p0', 'p1'], -4.349985)],
)
Unreliable test timings! On an initial run, this test took 389.55ms, which exceeded the deadline of 300.00ms, but on a subsequent run it took 7.79 ms, which did not. If you expect this sort of variability in your test timings, consider turning deadlines off for this test by setting deadline=None.
You can reproduce this example by temporarily adding @reproduce_failure('6.98.11', b'AXicrVNbS1RhFN17f5fTjDpSZkKDIzmhYVaWY5RdNLrpmJQVEjJgYChKPYg+FIWdBlHp5g1CI5oHESJfIl9KMcYbUqFIWIT60mUixYKoByPk9E0z59QPaD0cDufsb629114f0jaJEEEFJ9DXvs/pCQEcgjQULqJSROifhBTC4HSkCjVNIxhvPpw51f9mL+gXbt101azkNwAi5QDaGJiVCj83VPqM4Tu9oDfm9foNT4GHAcqLxMsIbX0FQBz0KK2UEIy5Hdoad9XYyBqReoAnEBgfI/1IjVu/bawHcc271zHNuSR9jHbujwWcXe4rvZHt7RAIwaj4AZxF1PLLiFcBuQWOZkW+q4FNqgz0oDrxT9OxEvTWPCMMJyYgv0f0GGEQhJOhaY5glsYJ9gF5C7FchmIMMiUan02bhDVaPe5GkcKI9jgIJqJCcTJKgT84q1XlYqGbZB2DRAFGqDuMLjtAYPnp0tAD9xPQr8HBvtaR4nQ2oVYgkpab1g2Oa9V+kjOmG4BcbW7xuNO1/dwx0B/titvS23Y3hplC30+xcpT3IQPRCFW8Wmr3g+DWKADnUQZBZGeZB2yjDlJ2iEgzJ2EY5TA4ueWDCs4378qEt+ltJ+gtun1gbHVnMtlR5jEG7ArxOsB9O7jJNqYegaF2T7Pv5Qro1xsWpp9NBtIxFTH2tAP5V+IDsErtZv3DPwhnIyqdxOZQzBP5GcoXgE5NWD0onMn8VOKtOzwP+iV3oKxjsDYdviCfI7EJeSFkMGtYuxaeJvJezUaQHQVK5miGQVGlursuHzlblAi6r3CqeJHV12A50i8UbQz5ErgQze7i44ViM73LZjlIaYz+r3ncoVJv5kX+vQBOKkFWFE4c8hnYHN5mRIWpCywrw6gqgBDSc4DfRqoAOA==') as a decorator on your test case
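The failure above is a timing flake rather than a logic falsification: the first execution exceeded the test's 300ms deadline (389.55ms) and a rerun did not (7.79ms). Below is a minimal, hedged sketch of the remedy the message itself suggests, disabling or relaxing the deadline via @settings; the strategy and the trivial property are placeholders, not the project's real test code. The @reproduce_failure('6.98.11', b'...') line printed above can also be pasted verbatim as an extra decorator on the real test to replay this exact example locally.

```python
# Illustrative sketch only: the strategy and property below are placeholders.
# The relevant part is settings(deadline=None), which stops Hypothesis from
# reporting a slow first run (389ms vs. the 300ms deadline here) as Flaky.
from hypothesis import given, settings, strategies as st


@settings(deadline=None)  # or e.g. deadline=1000 to merely relax the limit
@given(st.lists(st.floats(allow_nan=False, allow_infinity=False), min_size=1))
def test_symmetry_placeholder(values):
    # Deliberately trivial property; the decorator stack above is the point.
    assert list(reversed(list(reversed(values)))) == values
```

Disabling the deadline is common for property tests whose first example pays a one-time import or cache-warming cost; if the timing itself matters, a raised numeric deadline keeps some protection.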
unit-test
Process completed with exit code 1.
docker-build
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
unit-test
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3, actions/cache@v3, actions/setup-python@v4. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
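Both Node.js 16 warnings (docker-build and unit-test) point at pinned action majors in test.yml. A hedged sketch of the kind of bump they call for is below; the step layout and the Python version are assumptions, since the actual workflow file is not included in this log. actions/checkout@v4, actions/cache@v4, and actions/setup-python@v5 all run on the Node.js 20 runtime.

```yaml
# Sketch only: everything except the action version bumps is assumed.
jobs:
  unit-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4       # was actions/checkout@v3 (Node 16)
      - uses: actions/setup-python@v5   # was actions/setup-python@v4 (Node 16)
        with:
          python-version: "3.10"        # assumed version
      - uses: actions/cache@v4          # was actions/cache@v3 (Node 16)
        with:
          path: ~/.cache/pip
          key: pip-${{ hashFiles('**/requirements*.txt') }}
  docker-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4       # was actions/checkout@v3 (Node 16)
```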