
Template tests #163

Open · wants to merge 18 commits into dev

Conversation

tiagofilipe12 (Collaborator) commented Nov 5, 2018

This PR is a first implementation of tests for the Python templates available in flowcraft (related to issue #148). The basic idea is to place each test under flowcraft.tests.template_tests, in order to separate these tests from the core flowcraft tests.

For now I have only added tests for the mash_screen, mash_dist and mapping_patlas components, which check, for example, whether the expected files are generated. More complex templates might require other kinds of tests. The idea is to keep the Python templates as well tested as possible, so that others can contribute without breaking the functionality of the components.

I have also changed the imports in all templates so that they can be imported by pytest while running the tests.
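To make the idea concrete, here is a minimal sketch of what a file-generation check like the ones described above could look like. `write_mapping_json` is a hypothetical stand-in for a template's output step (the real mapping_patlas template writes a `{}_mapping.json` file, as seen in this PR's diff); only the shape of the test is meant to be illustrative.

```python
import json
import os
import tempfile


def write_mapping_json(sample, output_dir):
    # Hypothetical stand-in for a flowcraft template's output step:
    # writes a "<sample>_mapping.json" result file.
    out_path = os.path.join(output_dir, "{}_mapping.json".format(sample))
    with open(out_path, "w") as fh:
        json.dump({"sample": sample, "coverage": {}}, fh)
    return out_path


def test_mapping_json_is_generated():
    # Self-contained: the test creates its own workspace, runs the
    # template step, and checks the expected output file exists and
    # holds valid JSON.
    with tempfile.TemporaryDirectory() as tmp:
        out = write_mapping_json("sampleA", tmp)
        assert os.path.isfile(out)
        with open(out) as fh:
            data = json.load(fh)
        assert data["sample"] == "sampleA"


test_mapping_json_is_generated()
```

Because each test builds its own inputs in a temporary directory, it can run in any order under pytest without leftover files from other tests.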



codecov-io commented Nov 5, 2018

Codecov Report

Merging #163 into dev will increase coverage by 8.33%.
The diff coverage is 78.74%.

Impacted file tree graph

@@            Coverage Diff             @@
##              dev     #163      +/-   ##
==========================================
+ Coverage   43.67%   52.01%   +8.33%     
==========================================
  Files          64       70       +6     
  Lines        5946     6287     +341     
==========================================
+ Hits         2597     3270     +673     
+ Misses       3349     3017     -332
Impacted Files Coverage Δ
flowcraft/tests/test_pipeline_parser.py 100% <ø> (ø) ⬆️
flowcraft/tests/test_sanity.py 96.15% <ø> (ø) ⬆️
flowcraft/templates/process_newick.py 0% <0%> (ø) ⬆️
flowcraft/templates/pipeline_status.py 0% <0%> (ø) ⬆️
flowcraft/templates/trimmomatic.py 0% <0%> (ø) ⬆️
flowcraft/templates/fastqc_report.py 0% <0%> (ø) ⬆️
flowcraft/templates/process_mapping.py 0% <0%> (ø) ⬆️
flowcraft/templates/downsample_fastq.py 0% <0%> (ø) ⬆️
flowcraft/templates/process_assembly_mapping.py 0% <0%> (ø) ⬆️
flowcraft/templates/process_viral_assembly.py 0% <0%> (ø) ⬆️
... and 36 more

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update a3eb1af...f34f0ab. Read the comment docs.

ODiogoSilva (Collaborator) left a comment

This is a great effort indeed. We'll need to be pumping out tests for the remaining templates. My only major concern is that some of the tests do not appear to be self-contained, or depend on some execution made outside the test.
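One common way to address the self-containment concern raised above is a pytest fixture that builds the required input fresh for every test, instead of relying on a file produced by an earlier test. The sketch below assumes a hypothetical tab-separated depth file; the file name and format are illustrative, not taken from flowcraft.

```python
import pytest


def make_depth_file(dir_path):
    # Create the input a template test needs (hypothetical format:
    # tab-separated sequence/coverage pairs). Kept as a plain helper
    # so it can also be reused outside pytest.
    path = dir_path / "sampleA_depth.txt"
    path.write_text("seq1\t100\nseq2\t250\n")
    return path


@pytest.fixture
def depth_file(tmp_path):
    # pytest injects a fresh tmp_path per test, so every test gets its
    # own input file and no test depends on another's side effects.
    return make_depth_file(tmp_path)


def test_parses_depth_file(depth_file):
    lines = depth_file.read_text().splitlines()
    assert len(lines) == 2
```

With this pattern, deleting or reordering any single test cannot break the others, which is exactly the property the review comment is asking for.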

changelog.md (outdated, resolved)
}
}

#
Collaborator:

?

Collaborator (Author):

?!

flowcraft/tests/template_tests/test_mapping2json.py (outdated, resolved)

def test_generate_jsons():
"""
Test if the returns from this function are the expected results
Collaborator:

We haven't done this so far, but it would be good to have more informative comments. Ideally:

  • What this function is testing
  • What the expected result is
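A docstring following the two points above could look like this; the function name comes from the diff under review, while the described behavior is purely illustrative.

```python
def test_generate_jsons():
    """Test the output of generate_jsons (name taken from this PR's diff).

    What it tests: that each input sample yields one JSON entry.
    Expected result: a dict keyed by sample name, with non-empty values.
    """
```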

assert os.path.isfile("{}_mapping.json".format(depth_file))


def test_generate_report(fetch_file):
Collaborator:

Does this test depend on the execution of another?

ODiogoSilva self-assigned this Nov 18, 2018
tiagofilipe12 (Collaborator, Author) commented Jan 21, 2019

Does anyone still remember why this is pending? 😆

3 participants