
single run feature #99

Open: danlavu wants to merge 1 commit into master from single-run
Conversation

@danlavu commented Feb 17, 2025

No description provided.

@danlavu added the enhancement and help wanted labels on Feb 17, 2025
@danlavu marked this pull request as draft on February 17, 2025 16:20
@danlavu force-pushed the single-run branch 5 times, most recently from 896b279 to 8886b5d on February 19, 2025 23:00
@danlavu (Author) commented Feb 20, 2025

I don't like the logging output: it prints both 'SKIPPED' and 'Skipped' for a single test case, and I'm not sure where the second one comes from.

PASSED [ 31%]SKIPPED [ 37%]Single run marker set, skipping ipa

Skipped

The following marker values were tested; logs are attached.

@pytest.mark.importance("critical")
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_identity__single_run_no_marker(client: Client, provider: GenericProvider):
    pass


@pytest.mark.single_run()
@pytest.mark.importance("critical")
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_identity__single_run_no_value(client: Client, provider: GenericProvider):
    pass


@pytest.mark.single_run("ad")
@pytest.mark.importance("critical")
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_identity__single_run_value_specified(client: Client, provider: GenericProvider):
    pass


@pytest.mark.single_run("ad")
@pytest.mark.single_run("samba")
@pytest.mark.importance("critical")
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_identity__single_run_invalid_values_specified(client: Client, provider: GenericProvider):
    pass

pytest-single-run.txt

@danlavu removed the help wanted label on Feb 20, 2025
@danlavu marked this pull request as ready for review on February 20, 2025 13:06
GenericProvider tests can be set to run once as a default.
@pbrezina (Contributor) left a comment


I'm not a big fan of single_run. How about preferred_topology? It's still not perfect, so suggestions are welcome. You should also not take a string but the topology marker object instead. It would also be good to allow lists. That is:

# By default, runs only LDAP
@pytest.mark.preferred_topology(KnownTopology.LDAP)
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_example(...):
    pass

# By default, runs only LDAP and IPA
@pytest.mark.preferred_topology(KnownTopology.LDAP)
@pytest.mark.preferred_topology(KnownTopology.IPA)
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_example(...):
    pass

# By default, runs only LDAP and IPA
@pytest.mark.preferred_topology([KnownTopology.LDAP, KnownTopology.IPA])
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_example(...):
    pass
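
For illustration, a minimal sketch of how the plugin side could collect such markers while accepting both a single topology and a list per marker. The helper name and its placement are assumptions, not part of this PR:

import pytest


def collect_preferred_topologies(item: pytest.Item) -> list:
    """Gather every topology requested via @pytest.mark.preferred_topology.

    Hypothetical helper: accepts one topology object per marker, or a
    list/tuple of them, mirroring the three usage examples above.
    """
    preferred: list = []
    for mark in item.iter_markers("preferred_topology"):
        for arg in mark.args:
            if isinstance(arg, (list, tuple)):
                preferred.extend(arg)
            else:
                preferred.append(arg)
    return preferred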

Comment on lines +404 to +408
for y in item.iter_markers("topology"):
    if len(y.args) > 0 and y.args[0].name == "AnyProvider":
        self.single_run_topology = "ldap"
    if len(y.args) > 0 and y.args[0].name == "AnyADProvider":
        self.single_run_topology = "samba"
Contributor

This is SSSD-specific and can't be here. Please remove it.

Author

Ack.
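
Purely as a sketch of the direction this asks for: the generic code could treat the group-to-default mapping as data supplied by the SSSD-specific plugin, so the framework never names AnyProvider or AnyADProvider itself. Class and attribute names below are hypothetical:

import pytest


class GenericMultihostPlugin:
    """Generic side: knows nothing about concrete topology groups."""

    # Mapping of topology-group name -> default topology; empty by default.
    single_run_defaults: dict[str, str] = {}

    def default_single_run_topology(self, item: pytest.Item) -> str | None:
        for mark in item.iter_markers("topology"):
            if mark.args and mark.args[0].name in self.single_run_defaults:
                return self.single_run_defaults[mark.args[0].name]
        return None


class SSSDPlugin(GenericMultihostPlugin):
    """SSSD side: the SSSD-specific defaults from the diff live here instead."""

    single_run_defaults = {
        "AnyProvider": "ldap",
        "AnyADProvider": "samba",
    }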

Comment on lines +389 to +412
# Single run topology will default to LDAP or Samba.
if self.mh_ignore_single_run:
    return

if len([i for i in item.iter_markers("single_run")]) > 1:
    raise ValueError("Too many arguments for @pytest.mark.single_run.")

i = next(item.iter_markers("single_run"), "")
if not i:
    return

for i in item.iter_markers("single_run"):
    if len(i.args) > 0:
        self.single_run_topology = i.args[0]
    else:
        for y in item.iter_markers("topology"):
            if len(y.args) > 0 and y.args[0].name == "AnyProvider":
                self.single_run_topology = "ldap"
            if len(y.args) > 0 and y.args[0].name == "AnyADProvider":
                self.single_run_topology = "samba"

if self.single_run_topology is not self.current_topology:
    self.logger.info(f"Single run marker set, skipping {self.current_topology}")
    pytest.skip()
Contributor

Move this into a standalone method.
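
For reference, a sketch of the quoted block pulled out into its own method. The method name is hypothetical; the logic follows the diff above, with the identity comparison replaced by ordinary equality and the multiple-marker check tightened:

def _skip_unselected_single_run(self, item: pytest.Item) -> None:
    """Skip the test when @pytest.mark.single_run selects another topology."""
    if self.mh_ignore_single_run:
        return

    markers = list(item.iter_markers("single_run"))
    if not markers:
        return
    if len(markers) > 1:
        raise ValueError("Too many @pytest.mark.single_run markers.")

    mark = markers[0]
    if mark.args:
        self.single_run_topology = mark.args[0]
    else:
        # Defaults per topology group, kept here only to mirror the diff;
        # see the review note about moving this SSSD-specific part elsewhere.
        for topology in item.iter_markers("topology"):
            if topology.args and topology.args[0].name == "AnyProvider":
                self.single_run_topology = "ldap"
            if topology.args and topology.args[0].name == "AnyADProvider":
                self.single_run_topology = "samba"

    if self.single_run_topology != self.current_topology:
        self.logger.info(f"Single run marker set, skipping {self.current_topology}")
        pytest.skip()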

@pbrezina (Contributor)

It should probably rely on data.topology_mark rather than the global field. And maybe place it into pytest_runtest_call? It may solve the output problem that you see.
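
A rough sketch of that suggestion: a check driven from pytest_runtest_call that compares the single_run argument against the per-test data object. The helper name is hypothetical, and the assumption that data.topology_mark exposes a comparable name comes from this comment, not from the framework itself:

import pytest


def skip_if_not_selected(item: pytest.Item, data) -> None:
    """Hypothetical helper, intended to be called from pytest_runtest_call.

    'data' stands for the per-test data object mentioned above; its
    topology_mark attribute and .name are assumptions based on the comment.
    """
    mark = next(item.iter_markers("single_run"), None)
    if mark is None or getattr(data, "topology_mark", None) is None:
        return

    requested = mark.args[0] if mark.args else None
    if requested is not None and requested != data.topology_mark.name:
        pytest.skip(f"Single run marker set, skipping {data.topology_mark.name}")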

@danlavu (Author) commented Feb 26, 2025

I'm not a big fan of single_run. How about preferred_topology? It's still not perfect, so suggestions are welcome. You should also not take a string but the topology marker object instead. It would also be good to allow lists. That is:

Linguistically, preferred_topology is unclear. It sounds like it makes the defined topology run first, while in reality the new default is that the test runs only against the selected topology. Is it the phrase or one of the words in particular? Maybe the negation sounds better, @mark.skip_extra_topologies? I want it to be clear that it skips/bypasses/ignores/omits the additional topologies and is an exclusive/unique/single/solo/distinctive run. One of those words might make it sound better?

@pbrezina (Contributor)

I think "single run" does not describe it at all. Negation would be easier to describe, I think, but harder to use (as the use case is to run 1 instead of 4).

The use case is "run these topologies if there is no topology filter". default_selection, default_topology, default_topology_selection? We probably can't find short phrase that does not have to be backed up by documentation.
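
For comparison, the first example from above under one of the proposed names; purely hypothetical, since no name has been settled on:

# By default, runs only LDAP unless a topology filter is passed on the command line.
@pytest.mark.default_topology(KnownTopology.LDAP)
@pytest.mark.topology(KnownTopologyGroup.AnyProvider)
def test_example(client: Client, provider: GenericProvider):
    pass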

Labels: enhancement (New feature or request)
3 participants