Hv check test suite #120
Conversation
Please take a look and let me know what you think. The changes should be in test_unit_patch_extractors.
Adding skip to all the current integration tests seems dangerous to me, as development of new features might not be tested properly while we are writing the unit tests for this functionality.
Perhaps we can "simply" replace the execution on openEO with a check on the process graph, or even just do a pre-flight check contained in the Python client, so we don't fully lose these tests.
Furthermore, it is not entirely clear to me what the tests in test_unit_patch_extractors are supposed to test. In general, test files should follow the convention test_<file_to_be_tested>.py so that unit tests clearly point to the code being tested.
In this case the file would be test_feature_extractor.py, with perhaps a class combining the tests for the patch extractors.
# Helper functions to test execute
def create_mock_common_preparations():
This seems to be a constant, perhaps it is easier to create it as a constant; or even better as a pytest fixture?
Will set it as a fixture; it does need to be a function.
return xr.DataArray(data, dims=["bands", "t", "y", "x"])
def create_mock_rescale_s1():
This seems to be a constant, perhaps it is easier to create it as a constant; or even better as a pytest fixture?
Will set it as a fixture; it does need to be a function.
# test execute
@patch.object(
If I understand correctly, patch.object is meant to mock the output of one function when you don't want to mock an entire object. Here you are mocking a function of a dummy object, so I think it is clearer to just adjust your dummy object.
Source: https://realpython.com/python-mock-library/#patching-an-objects-attributes
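To illustrate the trade-off being discussed, here is a minimal sketch; the class and method names are simplified stand-ins for the PR's PatchExtractor, not the real code:

```python
from unittest.mock import patch


class Extractor:
    """Simplified stand-in for the real extractor class."""

    def _common_preparations(self):
        return "real"

    def execute(self):
        return self._common_preparations()


# Option 1: patch.object temporarily swaps one method on the real class,
# so execute() still runs the real control flow around the mocked call.
with patch.object(Extractor, "_common_preparations", return_value="mocked"):
    assert Extractor().execute() == "mocked"

# Outside the context manager the real method is restored.
assert Extractor().execute() == "real"


# Option 2: a dummy subclass overrides the method permanently; simpler to
# read, but the overridden method can then never be exercised through the dummy.
class DummyExtractor(Extractor):
    def _common_preparations(self):
        return "mocked"


assert DummyExtractor().execute() == "mocked"
```

This is the core of the disagreement below: patching keeps the real methods reachable, while baking the override into the dummy class removes that option.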
will take a look
For this test I mock the actual functions in the actual PatchExtractor.
However, by defining these in the DummyPatchExtractor, you would no longer be able to test the behavior of the real functionality through the Dummy class.
    "_rescale_s1_backscatter",
    return_value=create_mock_rescale_s1(),
)
def test_execute(mock_common_preparations, mock_rescale_s1):
What does this test test exactly? I don't see any gfmap functionality being used.
It tests whether execute actually calls _common_preparations and _rescale_s1_backscatter.
It is similar to testing whether a process graph remained the same.
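A sketch of this kind of wiring test; the class below is a toy stand-in, not the real PatchExtractor:

```python
from unittest.mock import patch


class Extractor:
    """Toy stand-in: execute() chains two internal processing steps."""

    def _common_preparations(self, cube):
        return cube

    def _rescale_s1_backscatter(self, cube):
        return cube

    def execute(self, cube):
        return self._rescale_s1_backscatter(self._common_preparations(cube))


# Patch both steps, then verify execute() calls them in order with the
# expected intermediate values. Nothing about their real behavior is tested,
# only that execute() wires the pipeline together.
with patch.object(Extractor, "_common_preparations", return_value="prepped") as prep, \
     patch.object(Extractor, "_rescale_s1_backscatter", return_value="rescaled") as rescale:
    result = Extractor().execute("raw")
    prep.assert_called_once_with("raw")
    rescale.assert_called_once_with("prepped")
    assert result == "rescaled"
```

Note that because patch.object installs a plain MagicMock as a class attribute, the mocked calls do not receive `self`, which is why the assertions above check only the cube argument.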
test_feature_extractor already exists (these are currently integration tests). For now I will opt to add "unit" to the name so that we can split them up accordingly later on.
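One possible way to make that split explicit later is a pytest marker rather than a filename prefix; the marker and test names below are hypothetical, not from the PR:

```python
import pytest


@pytest.mark.integration  # select with: pytest -m integration
def test_feature_extractor_on_backend():
    # Placeholder for an existing openEO-based integration test.
    pass


def test_feature_extractor_unit():
    # Plain unit test; excluded from integration runs via: pytest -m "not integration"
    assert callable(test_feature_extractor_on_backend)
```

The custom marker would need to be registered under `markers` in pytest.ini or pyproject.toml to avoid an unknown-marker warning at collection time.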
This PR adds a lot of TODOs to the code. I think these should be added to (multiple) issues so we don't lose track of them, can plan them, and prevent our code from becoming a TODO forest.
temp.nc
Should this file be committed?
probably not
I have included a unit test for the patch extractions.
With the integration tests disabled, the current suite runs in 22 s.