
feat: Add Initial Test Cases and Increase Coverage #32

Closed
wants to merge 79 commits into from

Conversation

dyga01
Collaborator

@dyga01 dyga01 commented Oct 2, 2024

  1. Pull Request: Add additional testing files and functions for execexam so that we have greater code coverage.

  2. Aidan Dyga (@dyga01), Hannah Brown (@hannahb09), Coltin Colucci (@Coltin2121)

  3. Additional and Improved Tests and Fuzzing #4

  4. test, enhancement

  5. This pull request adds a large number of testing files and functions. Most of the tests were written with Pytest, along with some Hypothesis test cases. Overall, it mainly aims to add initial test cases for many of the tool's features.

  6. Coverage of the test cases in our branch is currently at 99%. However, this number is expected to drop as our tests are integrated with the rest of the updated tool. Once most of the feature PRs have been merged, we will work to improve the coverage again.

  7. The tests have been run on a Mac. It would be great if a Linux user and a Windows user could test them out.

  8. Here is the output of running the following commands on a Mac:

a. `poetry run task test`
[Screenshot: 2024-10-02 at 4:07 PM]

b. `poetry run task coverage`
[Screenshot: 2024-10-02 at 4:08 PM]
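For readers unfamiliar with the two testing styles mentioned above, here is a minimal sketch of a plain Pytest case next to a property-style check of the kind Hypothesis automates. The `path_to_string` function below is a local, illustrative stand-in (the real converter lives in execexam's `convert` module), so the names are assumptions, not execexam's actual API:

```python
from pathlib import Path


def path_to_string(path: Path) -> str:
    """Illustrative stand-in for the converter in execexam's convert module."""
    return str(path)


def test_path_to_string_example():
    # Plain Pytest-style case: one concrete input, one expected output.
    p = Path("tests") / "test_convert.py"
    assert path_to_string(p) == str(p)


def test_path_to_string_roundtrip():
    # Property-style check (Hypothesis would generate these inputs instead
    # of this hand-written list): converting and re-wrapping is lossless.
    for raw in ("a", "a/b", "a/b/c.txt"):
        p = Path(raw)
        assert Path(path_to_string(p)) == p
```

The round-trip property is the kind of invariant that fuzzing with Hypothesis checks across many generated inputs rather than a handful of hand-picked ones.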

PCain02 and others added 30 commits September 5, 2024 15:33
…checking, python support, and dependency install
@dyga01 dyga01 added test Test cases or test running enhancement New feature or request labels Oct 2, 2024
@dyga01 dyga01 changed the title Feat: Add Initial Test Cases and Increase Coverage feat: Add Initial Test Cases and Increase Coverage Oct 2, 2024
Collaborator

@hemanialaparthi hemanialaparthi left a comment


@dyga01, I tested the commands that were provided, `poetry run task test` and `poetry run task coverage`, on a Mac and got both of the intended outputs!

  1. `poetry run task test`
============================================================================ test session starts =============================================================================
platform darwin -- Python 3.12.6, pytest-8.3.3, pluggy-1.5.0 -- /Users/hemanialaparthi/Library/Caches/pypoetry/virtualenvs/execexam-nU8we_eK-py3.12/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.12.6', 'Platform': 'macOS-14.6.1-arm64-arm-64bit', 'Packages': {'pytest': '8.3.3', 'pluggy': '1.5.0'}, 'Plugins': {'json-report': '1.5.0', 'metadata': '3.1.1', 'randomly': '3.15.0', 'anyio': '4.6.0', 'cov': '4.1.0', 'clarity': '1.0.1', 'hypothesis': '6.112.1'}}
Using --randomly-seed=142332441
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase(PosixPath('/Users/hemanialaparthi/Desktop/cmpsc203/enchancement/execexam/.hypothesis/examples'))
rootdir: /Users/hemanialaparthi/Desktop/cmpsc203/enchancement/execexam
configfile: pytest.ini
plugins: json-report-1.5.0, metadata-3.1.1, randomly-3.15.0, anyio-4.6.0, cov-4.1.0, clarity-1.0.1, hypothesis-6.112.1
collected 46 items                                                                                                                                                           

tests/test_debug.py::test_get_debugging_messages PASSED
tests/test_debug.py::test_add_message PASSED
tests/test_debug.py::test_clear_messages PASSED
tests/test_debug.py::test_has_debugging_messages PASSED
tests/test_debug.py::test_enum_values PASSED
tests/test_debug.py::test_messages_list_initially_empty PASSED
tests/test_debug.py::test_debug_function PASSED
tests/test_convert.py::test_path_to_string PASSED
tests/test_util.py::test_determine_execexam_return_code_no_tests_collected PASSED
tests/test_util.py::test_determine_execexam_return_code_usage_error PASSED
tests/test_util.py::test_determine_execexam_return_code_other PASSED
tests/test_util.py::test_determine_execexam_return_code_tests_failed PASSED
tests/test_util.py::test_determine_execexam_return_code_internal_error PASSED
tests/test_util.py::test_determine_execexam_return_code_interrupted PASSED
tests/test_enumerations.py::test_report_type_enum_invalid_name PASSED
tests/test_enumerations.py::test_theme_enum_invalid_name PASSED
tests/test_enumerations.py::test_theme_enum_access_by_name PASSED
tests/test_enumerations.py::test_advice_method_enum_access_by_name PASSED
tests/test_enumerations.py::test_report_type_enum_access_by_name PASSED
tests/test_enumerations.py::test_report_type_enum_values PASSED
tests/test_enumerations.py::test_advice_method_enum_invalid_name PASSED
tests/test_enumerations.py::test_theme_enum_values PASSED
tests/test_enumerations.py::test_advice_method_enum_values PASSED
tests/test_display.py::test_display_advice PASSED
tests/test_display.py::test_make_colon_separated_string PASSED
tests/test_display.py::test_get_display_return_code PASSED
tests/test_pytest_plugin.py::test_pytest_runtest_logreport PASSED
tests/test_pytest_plugin.py::test_pytest_assertion_pass_no_assertions PASSED
tests/test_pytest_plugin.py::test_pytest_assertion_pass_no_report PASSED
tests/test_pytest_plugin.py::test_pytest_assertion_pass PASSED
tests/test_pytest_plugin.py::test_pytest_runtest_logreport_not_call PASSED
tests/test_extract.py::test_no_labels PASSED
tests/test_extract.py::test_extract_details PASSED
tests/test_extract.py::test_is_failing_test_details_empty_with_non_empty_string PASSED
tests/test_extract.py::test_multiple_labels PASSED
tests/test_extract.py::test_extract_details_hypothesis PASSED
tests/test_extract.py::test_extract_test_output_without_label PASSED
tests/test_extract.py::test_single_label PASSED
tests/test_extract.py::test_extract_failing_test_details PASSED
tests/test_extract.py::test_extract_test_assertions_details PASSED
tests/test_extract.py::test_extract_test_assertion_details_list PASSED
tests/test_extract.py::test_extract_test_run_details PASSED
tests/test_extract.py::test_extract_test_output_with_label PASSED
tests/test_extract.py::test_is_failing_test_details_empty_with_empty_string PASSED
tests/test_extract.py::test_is_failing_test_details_empty_with_newline PASSED
tests/test_extract.py::test_extract_test_assertion_details PASSED

============================================================================= 46 passed in 0.17s =============================================================================
  2. `poetry run task coverage`
============================================================================ test session starts =============================================================================
platform darwin -- Python 3.12.6, pytest-8.3.3, pluggy-1.5.0
Using --randomly-seed=1175480625
rootdir: /Users/hemanialaparthi/Desktop/cmpsc203/enchancement/execexam
configfile: pytest.ini
plugins: json-report-1.5.0, metadata-3.1.1, randomly-3.15.0, anyio-4.6.0, cov-4.1.0, clarity-1.0.1, hypothesis-6.112.1
collected 46 items                                                                                                                                                           

tests/test_pytest_plugin.py .....
tests/test_util.py ......
tests/test_enumerations.py .........
tests/test_extract.py ...............
tests/test_display.py ...
tests/test_debug.py .......
tests/test_convert.py .

---------- coverage: platform darwin, python 3.12.6-final-0 ----------
Name                          Stmts   Miss Branch BrPart  Cover   Missing
-------------------------------------------------------------------------
execexam/__init__.py              0      0      0      0   100%
execexam/convert.py               7      0      2      0   100%
execexam/debug.py                22      0      4      0   100%
execexam/enumerations.py         17      0      0      0   100%
execexam/extract.py              80      0     29      1    99%   89->80
execexam/util.py                 14      0     10      0   100%
tests/__init__.py                 0      0      0      0   100%
tests/test_advise.py              0      0      0      0   100%
tests/test_convert.py            12      0      0      0   100%
tests/test_debug.py              32      0      2      0   100%
tests/test_display.py            46      0      8      0   100%
tests/test_enumerations.py       41      0      6      0   100%
tests/test_extract.py            85      0     12      0   100%
tests/test_main.py                0      0      0      0   100%
tests/test_pytest_plugin.py      67      0      8      1    99%   18->20
tests/test_util.py               14      0      0      0   100%
-------------------------------------------------------------------------
TOTAL                           437      0     81      2    99%
Coverage JSON written to file coverage.json

Required test coverage of 50% reached. Total coverage: 99.61%

============================================================================= 46 passed in 0.32s =============================================================================
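For context, the `coverage` task above appears to wrap pytest with the pytest-cov plugin (the output shows branch coverage, a term-missing report, a JSON report written to `coverage.json`, and a 50% fail-under threshold). A hypothetical taskipy fragment for `pyproject.toml` that would reproduce this behavior; the project's actual task definitions may differ:

```toml
# Illustrative reconstruction, not execexam's real configuration.
[tool.taskipy.tasks]
test = "pytest -x -s"
coverage = "pytest --cov=execexam --cov-branch --cov-report=term-missing --cov-report=json --cov-fail-under=50"
```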

Collaborator

@CalebKendra CalebKendra left a comment


This PR needs some changes before approval. To start:

  • The coverage outputs are displayed incorrectly, which leads me to believe that these files are not being tested correctly by pytest. Here is what our `poetry run task coverage` run looks like on execexam:

[Screenshot from 2024-10-17 15:19]

  • Here is an example from a previous project, Chasten, of what the output should look like:

[Screenshot from 2024-10-17 15:16]

  • Another issue is that the `test_coverage.py` file contains the same functions that it is testing. This file should import them from the `coverage.py` file instead.
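A minimal sketch of the fix for that last point, with illustrative names (execexam does not necessarily expose a `percent_covered` function): the test module must import the code under test rather than re-define it, because a copied function body exercises the copy, not the real implementation, so the module's coverage numbers become meaningless.

```python
# --- execexam/coverage.py (code under test; the function name is hypothetical) ---
def percent_covered(hit: int, total: int) -> float:
    """Return statement coverage as a percentage; an empty file counts as 100%."""
    if total == 0:
        return 100.0
    return 100.0 * hit / total


# --- tests/test_coverage.py (sketch) ---
# In the real repository this line would be an import, not a copy:
#     from execexam.coverage import percent_covered

def test_percent_covered():
    # 435 of 437 statements hit, as in the coverage table above.
    assert percent_covered(435, 437) > 99.0
    # Edge case: an empty file should not divide by zero.
    assert percent_covered(0, 0) == 100.0
```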

@gkapfham
Collaborator

How does this PR connect to PR #38?

@gkapfham
Collaborator

@dyga01 please note that there are conflicts in this branch.

@hannahb09 hannahb09 marked this pull request as draft October 24, 2024 19:07
@gkapfham
Collaborator

@dyga01 can this PR be closed?

@dyga01 dyga01 closed this Oct 30, 2024
9 participants