feat/fix: Add Additional Testing Files and Fix the Coverage Menu for Execexam #38
Conversation
@gkapfham Yes, I briefly spoke with @Chezka109. I thought that the plan was to merge her branch into our branch in the fork so that there are no merge conflicts when we merge her PR and then our PR for the main execexam tool. We also needed to ensure that the two PRs work together. What do you think?
@gkapfham, I just tested whether the coverage in this branch properly updates the coverage badge, and I can now say with 100% confidence that it works. When a user runs the command for coverage:
Hi, I am currently reviewing this PR from Windows, since I don't think we have had a Windows user test it yet.
On Windows I am running into one test failure. In test_main.py, the test is not getting the same exit code as the one being asserted. Here is the error:
Here is the full terminal output containing the error:
I am on Windows 10, if that makes a difference. I see that the Windows environment passed in GitHub Actions, so I am curious what the root cause of this may be. I will also note that I made sure I was on the correct branch.
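For context on the kind of check that is failing, here is a minimal, self-contained sketch of how a CLI exit code is typically asserted with Typer's CliRunner. The app, command, and expected code below are illustrative assumptions, not the actual contents of test_main.py.

```python
# Minimal sketch, assuming a Typer-based CLI; this is NOT the real
# test_main.py, and the expected exit code is an illustrative assumption.
import typer
from typer.testing import CliRunner

app = typer.Typer()


@app.command()
def run(test_path: str) -> None:
    """Pretend to run an exam and fail because the test path is missing."""
    typer.echo(f"could not find tests in {test_path}")
    # The real tool derives its return code from the pytest outcome.
    raise typer.Exit(code=4)


runner = CliRunner()


def test_run_with_missing_test() -> None:
    result = runner.invoke(app, ["missing_tests/"])
    # If the asserted code were derived from OS-specific process or path
    # handling, it could match on Linux/macOS but differ on Windows.
    assert result.exit_code == 4
```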
I just reran both of the commands on my Linux computer and it was fine, no errors.
Hello @hannahb09, @Coltin2121, and @dyga01, please note that there is a report that this test suite does not run correctly on Windows. Can you please investigate this issue?
@gkapfham I believe none of us are using the Windows OS, so it might be better if we could work with someone on that OS to resolve this issue.
@hannahb09, @dyga01, @Coltin2121, please review the recent updates to |
@AlishChhetri I just ran these changes on macOS and they appear to be working.
@dyga01, once a Linux user verifies that these changes work, we’ll have confirmed functionality across all three major operating systems. I’ll approve the PR following that round of testing.
Here is the output of my usage from Ubuntu:
caleb@:execexam (coverage_menu_fixes)$ poetry run task coverage
=============================================================================================== test session starts ===============================================================================================
platform linux -- Python 3.11.5, pytest-8.3.2, pluggy-1.5.0
Using --randomly-seed=826929214
rootdir: /home/caleb/Allegheny/CMP203/execexam
configfile: pytest.ini
plugins: anyio-4.4.0, json-report-1.5.0, cov-4.1.0, clarity-1.0.1, metadata-3.1.1, randomly-3.15.0, hypothesis-6.112.0
collected 58 items
tests/test_display.py ......
tests/test_util.py ......
tests/test_enumerations.py .........
tests/test_convert.py .
tests/test_debug.py .......
tests/test_extract.py ...............
tests/test_pytest_plugin.py .....
tests/test_main.py .
tests/test_advise.py ........
---------- coverage: platform linux, python 3.11.5-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing
-----------------------------------------------------------------------
execexam/advise.py 64 34 20 3 42% 25, 38-42, 84-89, 108-113, 123-128, 145-254
execexam/convert.py 7 0 2 0 100%
execexam/debug.py 22 0 4 0 100%
execexam/display.py 54 20 18 2 69% 35-83, 119->exit, 162-168
execexam/enumerations.py 17 0 0 0 100%
execexam/extract.py 80 0 29 1 99% 89->80
execexam/main.py 112 37 26 8 62% 44-45, 107, 115-116, 161-179, 228->233, 267-305, 329-365, 380-383
execexam/pytest_plugin.py 88 74 30 0 12% 35-43, 50-83, 91, 101-104, 114-117, 125-199, 212-247
execexam/util.py 14 0 10 0 100%
-----------------------------------------------------------------------
TOTAL 458 165 139 14 62%
1 empty file skipped.
Coverage JSON written to file coverage.json
Required test coverage of 50% reached. Total coverage: 61.64%
=============================================================================================== 58 passed in 17.90s ===============================================================================================
caleb@:execexam (coverage_menu_fixes)$ poetry run task test
=============================================================================================== test session starts ===============================================================================================
platform linux -- Python 3.11.5, pytest-8.3.2, pluggy-1.5.0 -- /home/caleb/.cache/pypoetry/virtualenvs/execexam-fnP_q_NH-py3.11/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.11.5', 'Platform': 'Linux-6.8.0-48-generic-x86_64-with-glibc2.35', 'Packages': {'pytest': '8.3.2', 'pluggy': '1.5.0'}, 'Plugins': {'anyio': '4.4.0', 'json-report': '1.5.0', 'cov': '4.1.0', 'clarity': '1.0.1', 'metadata': '3.1.1', 'randomly': '3.15.0', 'hypothesis': '6.112.0'}}
Using --randomly-seed=4222188027
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase(PosixPath('/home/caleb/Allegheny/CMP203/execexam/.hypothesis/examples'))
rootdir: /home/caleb/Allegheny/CMP203/execexam
configfile: pytest.ini
plugins: anyio-4.4.0, json-report-1.5.0, cov-4.1.0, clarity-1.0.1, metadata-3.1.1, randomly-3.15.0, hypothesis-6.112.0
collected 58 items
tests/test_extract.py::test_is_failing_test_details_empty_with_empty_string PASSED
tests/test_extract.py::test_no_labels PASSED
tests/test_extract.py::test_extract_details_hypothesis PASSED
tests/test_extract.py::test_extract_test_run_details PASSED
tests/test_extract.py::test_extract_failing_test_details PASSED
tests/test_extract.py::test_extract_test_output_with_label PASSED
tests/test_extract.py::test_multiple_labels PASSED
tests/test_extract.py::test_extract_test_assertion_details PASSED
tests/test_extract.py::test_is_failing_test_details_empty_with_non_empty_string PASSED
tests/test_extract.py::test_extract_test_assertion_details_list PASSED
tests/test_extract.py::test_is_failing_test_details_empty_with_newline PASSED
tests/test_extract.py::test_extract_details PASSED
tests/test_extract.py::test_extract_test_output_without_label PASSED
tests/test_extract.py::test_extract_test_assertions_details PASSED
tests/test_extract.py::test_single_label PASSED
tests/test_debug.py::test_has_debugging_messages PASSED
tests/test_debug.py::test_add_message PASSED
tests/test_debug.py::test_clear_messages PASSED
tests/test_debug.py::test_debug_function PASSED
tests/test_debug.py::test_enum_values PASSED
tests/test_debug.py::test_get_debugging_messages PASSED
tests/test_debug.py::test_messages_list_initially_empty PASSED
tests/test_convert.py::test_path_to_string PASSED
tests/test_enumerations.py::test_advice_method_enum_values PASSED
tests/test_enumerations.py::test_report_type_enum_access_by_name PASSED
tests/test_enumerations.py::test_report_type_enum_values PASSED
tests/test_enumerations.py::test_advice_method_enum_access_by_name PASSED
tests/test_enumerations.py::test_report_type_enum_invalid_name PASSED
tests/test_enumerations.py::test_theme_enum_values PASSED
tests/test_enumerations.py::test_theme_enum_invalid_name PASSED
tests/test_enumerations.py::test_theme_enum_access_by_name PASSED
tests/test_enumerations.py::test_advice_method_enum_invalid_name PASSED
tests/test_pytest_plugin.py::test_pytest_assertion_pass_no_assertions PASSED
tests/test_pytest_plugin.py::test_pytest_assertion_pass PASSED
tests/test_pytest_plugin.py::test_pytest_assertion_pass_no_report PASSED
tests/test_pytest_plugin.py::test_pytest_runtest_logreport_not_call PASSED
tests/test_pytest_plugin.py::test_pytest_runtest_logreport PASSED
tests/test_display.py::test_display_content_plain_text PASSED
tests/test_display.py::test_display_content PASSED
tests/test_display.py::test_get_display_return_code PASSED
tests/test_display.py::test_display_advice PASSED
tests/test_display.py::test_make_colon_separated_string PASSED
tests/test_display.py::test_display_content_no_newline PASSED
tests/test_advise.py::test_check_internet_connection_timeout PASSED
tests/test_advise.py::test_check_internet_connection_different_dns[dns_server0] PASSED
tests/test_advise.py::test_check_internet_connection_failure PASSED
tests/test_advise.py::test_check_internet_connection_different_dns[dns_server1] PASSED
tests/test_advise.py::test_check_internet_connection_success PASSED
tests/test_advise.py::test_check_internet_connection_different_dns[dns_server2] PASSED
tests/test_advise.py::test_validate_url PASSED
tests/test_advise.py::test_check_internet_connection_different_dns[dns_server3] PASSED
tests/test_util.py::test_determine_execexam_return_code_other PASSED
tests/test_util.py::test_determine_execexam_return_code_no_tests_collected PASSED
tests/test_util.py::test_determine_execexam_return_code_interrupted PASSED
tests/test_util.py::test_determine_execexam_return_code_tests_failed PASSED
tests/test_util.py::test_determine_execexam_return_code_internal_error PASSED
tests/test_util.py::test_determine_execexam_return_code_usage_error PASSED
tests/test_main.py::test_run_with_missing_test PASSED
=============================================================================================== 58 passed in 9.88s ================================================================================================
caleb@:execexam (coverage_menu_fixes)$
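For anyone reproducing the run above, here is a rough sketch of how the two tasks could be defined with taskipy in pyproject.toml. The flags are inferred from the output (branch coverage, a term-missing report, a JSON report, and a 50% fail-under threshold) and may not match execexam's actual configuration.

```toml
# Hypothetical taskipy task definitions inferred from the output above;
# the real entries in execexam's pyproject.toml may differ.
[tool.taskipy.tasks]
test = "pytest -x -s -vv"
coverage = "pytest --cov=execexam --cov-branch --cov-report=term-missing --cov-report=json --cov-fail-under=50"
```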
LGTM. Test functionality works on Ubuntu here.
LGTM, thank you to Caleb for testing out this feature on Linux (Ubuntu)!
I am glad that the changes I made seem to have worked for everyone else. All the other work looks good to me!
The command |
I know I am someone who worked on the PR, but with the updates from Pallas they do work on my Linux computer.
Pull Request: Add additional testing files and fix the coverage menu for execexam
Aidan Dyga (@dyga01), Hannah Brown (@hannahb09), Coltin Colucci (@Coltin2121)
Additional and Improved Tests and Fuzzing #4 and Bug: Accurate Coverage Checks #35
test, enhancement
This pull request aims to add a large number of testing files and functions to execexam. Most of the tests were created with Pytest. We also worked to fix the coverage menu so that it provides accurate coverage and omits files that do not have tests from the coverage report. Overall, this pull request mainly aims to add initial test cases and fix the coverage menu.
Coverage of the test cases in our branch currently passes the required threshold. However, this number is expected to drop as our tests are integrated with the rest of the updated tool. Once most of the feature PRs have been merged, we will work to improve the coverage again.
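As a rough illustration of the coverage-menu change (not the exact configuration in this pull request), omitting untested files and reporting missing lines can be controlled through the coverage settings in pyproject.toml; the path patterns below are assumptions.

```toml
# Hypothetical coverage settings; the omitted path patterns are
# illustrative assumptions rather than the exact values from this PR.
[tool.coverage.run]
branch = true
omit = ["tests/*", "*/__init__.py"]

[tool.coverage.report]
show_missing = true
```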
The tests have been conducted on a Mac. It would be great if a Linux user and a Windows user could test it out as well.
Here is the output of running the following commands on Mac.
a. poetry run task test
b. poetry run task coverage