
Test App Icon failure in testbed #3021

Open
HalfWhitt opened this issue Dec 5, 2024 · 3 comments
Labels
bug A crash or error in behavior.

Comments

@HalfWhitt
Contributor

Describe the bug

I keep forgetting to post this, but it's been happening consistently for a while on my computer, even though I've never seen it pop up in CI. test_app_icon always fails, apparently because the pixel being tested is off by just a point or two in the green and blue bands: it's (149, 118, 75, 255), while the closest acceptable value is (149, 119, 73, 255).

Steps to reproduce

briefcase dev --test -- tests/app/test_app.py::test_app_icon
(Same with briefcase run)

Expected behavior

I'd expect it to pass, like in CI.

Screenshots

No response

Environment

  • Operating System: macOS 14.6.1
  • Python version: 3.13
  • Software versions:
    • Toga: main branch, as well as 0.4.8

Logs

============================= test session starts ==============================
collecting ... collected 1 item

tests/app/test_app.py::test_app_icon FAILED                              [100%]

=================================== FAILURES ===================================
________________________________ test_app_icon _________________________________
Traceback (most recent call last):
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/runner.py", line 341, in from_call
    result: TResult | None = func()
                             ~~~~^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/runner.py", line 242, in <lambda>
    lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise
            ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_hooks.py", line 513, in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
           ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_manager.py", line 120, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
           ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 139, in _multicall
    raise exception.with_traceback(exception.__traceback__)
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 122, in _multicall
    teardown.throw(exception)  # type: ignore[union-attr]
    ~~~~~~~~~~~~~~^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/threadexception.py", line 92, in pytest_runtest_call
    yield from thread_exception_runtest_hook()
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/threadexception.py", line 68, in thread_exception_runtest_hook
    yield
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 122, in _multicall
    teardown.throw(exception)  # type: ignore[union-attr]
    ~~~~~~~~~~~~~~^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/unraisableexception.py", line 95, in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/unraisableexception.py", line 70, in unraisable_exception_runtest_hook
    yield
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 122, in _multicall
    teardown.throw(exception)  # type: ignore[union-attr]
    ~~~~~~~~~~~~~~^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/logging.py", line 846, in pytest_runtest_call
    yield from self._runtest_for(item, "call")
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/logging.py", line 829, in _runtest_for
    yield
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 122, in _multicall
    teardown.throw(exception)  # type: ignore[union-attr]
    ~~~~~~~~~~~~~~^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/capture.py", line 880, in pytest_runtest_call
    return (yield)
            ^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 122, in _multicall
    teardown.throw(exception)  # type: ignore[union-attr]
    ~~~~~~~~~~~~~~^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/skipping.py", line 257, in pytest_runtest_call
    return (yield)
            ^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 103, in _multicall
    res = hook_impl.function(*args)
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/runner.py", line 174, in pytest_runtest_call
    item.runtest()
    ~~~~~~~~~~~~^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pytest_asyncio/plugin.py", line 457, in runtest
    super().runtest()
    ~~~~~~~~~~~~~~~^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/python.py", line 1627, in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_hooks.py", line 513, in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
           ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_manager.py", line 120, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
           ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 182, in _multicall
    return outcome.get_result()
           ~~~~~~~~~~~~~~~~~~^^
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_result.py", line 100, in get_result
    raise exc.with_traceback(exc.__traceback__)
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pluggy/_callers.py", line 103, in _multicall
    res = hook_impl.function(*args)
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/_pytest/python.py", line 159, in pytest_pyfunc_call
    result = testfunction(**testargs)
  File "/Users/charles/.pyenv/versions/3.13.0/envs/toga_13/lib/python3.13/site-packages/pytest_asyncio/plugin.py", line 929, in inner
    _loop.run_until_complete(task)
    ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/Users/charles/toga_dev/toga/testbed/tests/conftest.py", line 142, in run_until_complete
    return asyncio.run_coroutine_threadsafe(coro, self.loop).result()
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/Users/charles/.pyenv/versions/3.13.0/lib/python3.13/concurrent/futures/_base.py", line 456, in result
    return self.__get_result()
           ~~~~~~~~~~~~~~~~~^^
  File "/Users/charles/.pyenv/versions/3.13.0/lib/python3.13/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/Users/charles/toga_dev/toga/testbed/tests/app/test_app.py", line 245, in test_app_icon
    app_probe.assert_app_icon(None)
    ~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/Users/charles/toga_dev/toga/cocoa/tests_backend/app.py", line 95, in assert_app_icon
    assert mid_color in {
    ...<5 lines>...
    }
AssertionError: assert (149, 118, 75, 255) in {(130, 100, 57, 255), (130, 109, 66, 255), (138, 107, 64, 255), (138, 108, 64, 255), (149, 119, 73, 255)}
=========================== short test summary info ============================
FAILED tests/app/test_app.py::test_app_icon - assert (149, 118, 75, 255) in {(130, 100, 57, 255), (130, 109, 66, 255), (138, 107, 64, 255), (138, 108, 64, 255), (149, 119, 73, 255)}
============================== 1 failed in 0.27s ===============================

[testbed] Test suite failed!

Additional context

I see this comment in the test:

# Due to icon resizing and colorspace issues, the exact pixel colors are
# inconsistent, so multiple values must be provided for test purposes.

Perhaps we could test within an acceptable range for each band, rather than expecting one of the discrete sets of values listed. It's odd that, across the three tests, each band only varies a little between the acceptable values, except for the explicit icon's mid color's red value, which ranges from 0 to 105.
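
As a very rough sketch of what that per-band check could look like (illustrative only; the helper name and tolerance are invented, and the candidate set below is just the one from the failing assertion):

def assert_color_close(actual, candidates, tolerance=4):
    # Pass if `actual` is within `tolerance` of at least one candidate,
    # compared band by band, instead of requiring an exact match.
    assert any(
        all(abs(a - e) <= tolerance for a, e in zip(actual, expected))
        for expected in candidates
    ), f"{actual} is not within {tolerance} of any of {candidates}"

assert_color_close(
    mid_color,
    {
        (130, 100, 57, 255),
        (130, 109, 66, 255),
        (138, 107, 64, 255),
        (138, 108, 64, 255),
        (149, 119, 73, 255),
    },
)

With a tolerance of 4, the (149, 118, 75, 255) value above would match (149, 119, 73, 255) and the test would pass.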

(Speaking of testing with 0.4.8, would there be any reason why the 0.4.8 tag is missing from my repo, even after a fetch --all? The latest one I have is 0.4.7, but I can manually check out commit b987e88. And I can't fetch any tags for Travertino at all.)

HalfWhitt added the bug label on Dec 5, 2024
@freakboy3742
Member

Perhaps we could test within an acceptable range for each band, rather than expecting one of the discrete sets of values listed. It's odd that, across the three tests, each band only varies a little between the acceptable values, except for the explicit icon's mid color's red value, which ranges from 0 to 105.

I suspect this may have something to do with sizing: the test technique is based on pixel picking, but if the icon is rendered at a different size, the pixel being picked might end up touching a border (or something similar) that has a radically different red value.

If we can't find a reliable fix for picking the same pixel every time (or pick a location that isn't subject to this sort of problem), a "soft" match sounds like a plausible workaround.
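
To sketch one way a soft match could work (purely illustrative, and it assumes a Pillow-style image with getpixel, which may not be what the probe actually holds):

def sample_region(image, x, y, radius=1):
    # Average a small neighbourhood around (x, y), so an off-by-one in the
    # pixel being picked has much less effect on the sampled colour.
    pixels = [
        image.getpixel((x + dx, y + dy))
        for dy in range(-radius, radius + 1)
        for dx in range(-radius, radius + 1)
    ]
    return tuple(round(sum(band) / len(pixels)) for band in zip(*pixels))

The averaged colour could then be compared against a single reference value with a per-band tolerance, rather than against a set of exact pixel values.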

(Speaking of testing with 0.4.8, would there be any reason why the 0.4.8 tag is missing from my repo, even after a fetch --all? The latest one I have is 0.4.7, but I can manually check out commit b987e88. And I can't fetch any tags for Travertino at all.)

As long as you've got the upstream repo as a remote, I can't think of any reason why this would be happening.

@mhsmith
Member

mhsmith commented Dec 5, 2024

would there be any reason why the 0.4.8 tag is missing from my repo, even after a fetch --all? The latest one I have is 0.4.7, but I can manually check out commit b987e88. And I can't fetch any tags for Travertino at all.

This has always been a bit mysterious to me as well, but you might need to run git fetch --tags.
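
If that doesn't do it, it might also be worth checking what the remote actually advertises (upstream here is an assumption; substitute whichever remote name points at the BeeWare repo):

git ls-remote --tags upstream    # list the tags that exist on that remote
git fetch upstream --tags        # fetch all tags from that remote explicitly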

@HalfWhitt
Contributor Author

would there be any reason why the 0.4.8 tag is missing from my repo, even after a fetch --all? The latest one I have is 0.4.7, but I can manually check out commit b987e88. And I can't fetch any tags for Travertino at all.

This has always been a bit mysterious to me as well, but you might need to run git fetch --tags.

I've tried that too. Ah well, it's not particularly a problem, just mysterious and mildly inconvenient.
