Release v0.21.1 #298

Merged 14 commits on Dec 17, 2024
2 changes: 1 addition & 1 deletion .github/workflows/changelog.yml
@@ -13,4 +13,4 @@ on:

 jobs:
   call-changelog-check-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-changelog-check.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-changelog-check.yml@v0.12.0
2 changes: 1 addition & 1 deletion .github/workflows/create-jira-issue.yml
@@ -6,7 +6,7 @@ on:

 jobs:
   call-create-jira-issue-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-create-jira-issue.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-create-jira-issue.yml@v0.12.0
     secrets:
       JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
       JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
2 changes: 1 addition & 1 deletion .github/workflows/labeled-pr.yml
@@ -12,4 +12,4 @@ on:

 jobs:
   call-labeled-pr-check-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-labeled-pr-check.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-labeled-pr-check.yml@v0.12.0
2 changes: 1 addition & 1 deletion .github/workflows/release-template-comment.yml
@@ -7,6 +7,6 @@ on:

 jobs:
   call-release-checklist-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-release-checklist-comment.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-release-checklist-comment.yml@v0.12.0
     secrets:
       USER_TOKEN: ${{ secrets.GITHUB_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/release.yml
@@ -7,7 +7,7 @@ on:

 jobs:
   call-release-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-release.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-release.yml@v0.12.0
     with:
       release_prefix: HyP3 autoRIFT
     secrets:
13 changes: 6 additions & 7 deletions .github/workflows/static-analysis.yml
@@ -3,11 +3,10 @@ name: Static analysis
 on: push

 jobs:
-  call-flake8-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-flake8.yml@v0.11.2
-    with:
-      local_package_names: hyp3_autorift
-      excludes: src/hyp3_autorift/vend
-
   call-secrets-analysis-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-secrets-analysis.yml@v0.11.2
+    # Docs: https://github.com/ASFHyP3/actions
+    uses: ASFHyP3/actions/.github/workflows/reusable-secrets-analysis.yml@v0.12.0
+
+  call-ruff-workflow:
+    # Docs: https://github.com/ASFHyP3/actions
+    uses: ASFHyP3/actions/.github/workflows/reusable-ruff.yml@v0.12.0
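Note: the flake8 job (and its import-order, blind-except, and builtins plugins, dropped from the dependency files below) is replaced by a single ruff job. A rough local equivalent of what CI now checks, sketched in Python; the exact flags live inside the reusable workflow, so the two commands here are an assumption:

```python
# Sketch: approximate the reusable-ruff workflow locally. The exact flags
# used by ASFHyP3/actions are an assumption; `ruff check` and
# `ruff format --check` are the two standard ruff passes.
import subprocess

subprocess.run(['ruff', 'check', '.'], check=True)  # lint, including the isort "I" rules
subprocess.run(['ruff', 'format', '--check', '.'], check=True)  # report formatting issues, write nothing
```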
2 changes: 1 addition & 1 deletion .github/workflows/tag-version.yml
@@ -7,6 +7,6 @@ on:

 jobs:
   call-bump-version-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-bump-version.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-bump-version.yml@v0.12.0
     secrets:
       USER_TOKEN: ${{ secrets.TOOLS_BOT_PAK }}
6 changes: 3 additions & 3 deletions .github/workflows/test-and-build.yml
@@ -12,20 +12,20 @@ on:

 jobs:
   call-pytest-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-pytest.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-pytest.yml@v0.12.0
     with:
       local_package_name: hyp3_autorift
       python_versions: >-
         ["3.9"]

   call-version-info-workflow:
-    uses: ASFHyP3/actions/.github/workflows/reusable-version-info.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-version-info.yml@v0.12.0
     with:
       python_version: '3.9'

   call-docker-ghcr-workflow:
     needs: call-version-info-workflow
-    uses: ASFHyP3/actions/.github/workflows/reusable-docker-ghcr.yml@v0.11.2
+    uses: ASFHyP3/actions/.github/workflows/reusable-docker-ghcr.yml@v0.12.0
     with:
       version_tag: ${{ needs.call-version-info-workflow.outputs.version_tag }}
     secrets:
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -6,6 +6,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
 and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+## [0.21.1]
+### Changed
+- The [`static-analysis`](.github/workflows/static-analysis.yml) GitHub Actions workflow now uses `ruff` rather than `flake8` for linting.
+
 ## [0.21.0]
 ### Added
 * Logger is now configured in process.main() so paths to reference/secondary scenes will now be logged.
5 changes: 1 addition & 4 deletions environment.yml
@@ -10,10 +10,7 @@ dependencies:
   - pip
   # For packaging, and testing
   - python-build
-  - flake8
-  - flake8-import-order
-  - flake8-blind-except
-  - flake8-builtins
+  - ruff
  - pillow
  - pytest
  - pytest-console-scripts
33 changes: 29 additions & 4 deletions pyproject.toml
@@ -38,10 +38,7 @@ dynamic = ["version"]

 [project.optional-dependencies]
 develop = [
-    'flake8',
-    'flake8-import-order',
-    'flake8-blind-except',
-    'flake8-builtins',
+    'ruff',
     'pillow',
     'pytest',
     'pytest-cov',
@@ -66,3 +63,31 @@ zip-safe = false
 where = ["src"]

 [tool.setuptools_scm]
+
+[tool.ruff]
+exclude = ["vend"]
+line-length = 120
+# The directories to consider when resolving first- vs. third-party imports.
+# See: https://docs.astral.sh/ruff/settings/#src
+src = ["src", "tests"]
+
+[tool.ruff.format]
+indent-style = "space"
+quote-style = "single"
+
+[tool.ruff.lint]
+extend-select = [
+    "I",  # isort: https://docs.astral.sh/ruff/rules/#isort-i
+    # TODO: Uncomment the following extensions and address their warnings:
+    # "UP",   # pyupgrade: https://docs.astral.sh/ruff/rules/#pyupgrade-up
+    # "D",    # pydocstyle: https://docs.astral.sh/ruff/rules/#pydocstyle-d
+    # "ANN",  # annotations: https://docs.astral.sh/ruff/rules/#flake8-annotations-ann
+    # "PTH",  # use-pathlib-pth: https://docs.astral.sh/ruff/rules/#flake8-use-pathlib-pth
+]
+
+[tool.ruff.lint.pydocstyle]
+convention = "google"
+
+[tool.ruff.lint.isort]
+case-sensitive = true
+lines-after-imports = 2
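The isort and format settings above pin the style ruff enforces across the source changes in this PR. A hypothetical module that satisfies them (sorted imports, two blank lines after the import block, single quotes):

```python
# Hypothetical example, not from this repository: code that passes the
# [tool.ruff] configuration above.
import os
from pathlib import Path


DEFAULT_DIR = 'products'  # quote-style = "single" means ruff format keeps these quotes


def product_dir() -> Path:
    # line-length = 120 leaves plenty of room for one-liners like this
    return Path(os.environ.get('PRODUCT_DIR', DEFAULT_DIR))
```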
13 changes: 8 additions & 5 deletions src/hyp3_autorift/__init__.py
@@ -2,14 +2,17 @@

 from importlib.metadata import PackageNotFoundError, version

+
 try:
     __version__ = version(__name__)
 except PackageNotFoundError:
-    print(f'{__name__} package is not installed!\n'
-          f'Install in editable/develop mode via (from the top of this repo):\n'
-          f'   python -m pip install -e .[develop]\n'
-          f'Or, to just get the version number use:\n'
-          f'   python setup.py --version')
+    print(
+        f'{__name__} package is not installed!\n'
+        f'Install in editable/develop mode via (from the top of this repo):\n'
+        f'   python -m pip install -e .[develop]\n'
+        f'Or, to just get the version number use:\n'
+        f'   python setup.py --version'
+    )

 __all__ = [
     '__version__',
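Behavior is unchanged here; the print call is just reformatted. Once the package is installed, the version resolves as usual:

```python
# Usage check (assumes the package is installed, e.g. via `pip install -e .[develop]`):
import hyp3_autorift

print(hyp3_autorift.__version__)  # e.g. '0.21.1' for this release
```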
11 changes: 5 additions & 6 deletions src/hyp3_autorift/__main__.py
@@ -2,7 +2,6 @@
 AutoRIFT processing for HyP3
 """

-
 import argparse
 import os
 import sys
@@ -16,8 +15,10 @@

 def main():
     parser = argparse.ArgumentParser(prefix_chars='+', formatter_class=argparse.ArgumentDefaultsHelpFormatter)
     parser.add_argument(
-        '++process', choices=['hyp3_autorift', 's1_correction'], default='hyp3_autorift',
-        help='Select the console_script entrypoint to use'  # console_script entrypoints are specified in `setup.py`
+        '++process',
+        choices=['hyp3_autorift', 's1_correction'],
+        default='hyp3_autorift',
+        help='Select the console_script entrypoint to use',  # console_script entrypoints are specified in `setup.py`
     )
     parser.add_argument('++omp-num-threads', type=int, help='The number of OpenMP threads to use for parallel regions')
@@ -38,9 +39,7 @@ def main():
     (process_entry_point,) = {process for process in eps if process.name == args.process}

     sys.argv = [args.process, *unknowns]
-    sys.exit(
-        process_entry_point.load()()
-    )
+    sys.exit(process_entry_point.load()())


 if __name__ == '__main__':
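For context, `prefix_chars='+'` lets the wrapper take `++` options without colliding with the `-`/`--` options of whichever console script it forwards to; anything argparse does not recognize is passed through untouched. A minimal, self-contained sketch of the same dispatch pattern (simplified, not the module itself):

```python
# Minimal sketch of the '++' dispatch pattern: parse the wrapper's own
# options, then forward everything else to the selected entry point.
import argparse
import sys
from importlib.metadata import entry_points


def main():
    parser = argparse.ArgumentParser(prefix_chars='+')
    parser.add_argument('++process', default='hyp3_autorift')
    args, unknowns = parser.parse_known_args()

    # entry_points(group=...) is the Python 3.10+ signature; on 3.9,
    # filter entry_points()['console_scripts'] instead.
    eps = entry_points(group='console_scripts')
    (process_entry_point,) = {ep for ep in eps if ep.name == args.process}

    sys.argv = [args.process, *unknowns]  # hand the remaining args to the entry point
    sys.exit(process_entry_point.load()())


if __name__ == '__main__':
    main()
```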
13 changes: 5 additions & 8 deletions src/hyp3_autorift/crop.py
@@ -36,7 +36,8 @@
 import pyproj
 import xarray as xr

-ENCODING_ATTRS = ['_FillValue', 'dtype', "zlib", "complevel", "shuffle", 'add_offset', 'scale_factor']
+
+ENCODING_ATTRS = ['_FillValue', 'dtype', 'zlib', 'complevel', 'shuffle', 'add_offset', 'scale_factor']


 def crop_netcdf_product(netcdf_file: Path) -> Path:
@@ -58,7 +59,7 @@ def crop_netcdf_product(netcdf_file: Path) -> Path:
     # Based on X/Y extends, mask original dataset
     mask_lon = (ds.x >= grid_x_min) & (ds.x <= grid_x_max)
     mask_lat = (ds.y >= grid_y_min) & (ds.y <= grid_y_max)
-    mask = (mask_lon & mask_lat)
+    mask = mask_lon & mask_lat

     cropped_ds = ds.where(mask).dropna(dim='x', how='all').dropna(dim='y', how='all')
     cropped_ds = cropped_ds.load()
@@ -74,11 +75,7 @@ def crop_netcdf_product(netcdf_file: Path) -> Path:

     # Convert to lon/lat coordinates
     projection = ds['mapping'].attrs['spatial_epsg']
-    to_lon_lat_transformer = pyproj.Transformer.from_crs(
-        f"EPSG:{projection}",
-        'EPSG:4326',
-        always_xy=True
-    )
+    to_lon_lat_transformer = pyproj.Transformer.from_crs(f'EPSG:{projection}', 'EPSG:4326', always_xy=True)

     # Update centroid information for the granule
     center_lon_lat = to_lon_lat_transformer.transform(center_x, center_y)
@@ -91,7 +88,7 @@ def crop_netcdf_product(netcdf_file: Path) -> Path:
     y_cell = y_values[1] - y_values[0]

     # It was decided to keep all values in GeoTransform center-based
-    cropped_ds['mapping'].attrs['GeoTransform'] = f"{x_values[0]} {x_cell} 0 {y_values[0]} 0 {y_cell}"
+    cropped_ds['mapping'].attrs['GeoTransform'] = f'{x_values[0]} {x_cell} 0 {y_values[0]} 0 {y_cell}'

     # Compute chunking like AutoRIFT does:
     # https://github.com/ASFHyP3/hyp3-autorift/blob/develop/hyp3_autorift/vend/netcdf_output.py#L410-L411
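The crop logic above is a general xarray pattern: build boolean masks on the coordinate arrays, apply `where`, then drop the all-NaN rows and columns. A self-contained toy example (synthetic data, not the netCDF velocity products this module actually handles):

```python
# Toy demonstration of the mask/where/dropna crop pattern used above.
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {'v': (('y', 'x'), np.random.rand(10, 10))},
    coords={'x': np.arange(10), 'y': np.arange(10)},
)

# Keep only 2 <= x <= 7 and 3 <= y <= 6; everything else becomes NaN...
mask = (ds.x >= 2) & (ds.x <= 7) & (ds.y >= 3) & (ds.y <= 6)
# ...and the all-NaN edge rows/columns are then dropped entirely.
cropped = ds.where(mask).dropna(dim='x', how='all').dropna(dim='y', how='all')

print(cropped.sizes)  # Frozen({'y': 4, 'x': 6})
```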
9 changes: 5 additions & 4 deletions src/hyp3_autorift/geometry.py
@@ -3,14 +3,15 @@
 import logging
 from typing import Tuple

-from osgeo import ogr
-from osgeo import osr
+from osgeo import ogr, osr
+

 log = logging.getLogger(__name__)


-def polygon_from_bbox(x_limits: Tuple[float, float], y_limits: Tuple[float, float],
-                      epsg_code: int = 4326) -> ogr.Geometry:
+def polygon_from_bbox(
+    x_limits: Tuple[float, float], y_limits: Tuple[float, float], epsg_code: int = 4326
+) -> ogr.Geometry:
     ring = ogr.Geometry(ogr.wkbLinearRing)
     ring.AddPoint_2D(x_limits[0], y_limits[1])
     ring.AddPoint_2D(x_limits[1], y_limits[1])
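A short usage sketch for the reformatted signature (the coordinates are made up for illustration):

```python
# Hypothetical call: build an OGR polygon for a 1x1 degree lon/lat box
# using the default epsg_code=4326.
from hyp3_autorift.geometry import polygon_from_bbox

polygon = polygon_from_bbox(x_limits=(-49.0, -48.0), y_limits=(69.0, 70.0))
print(polygon.ExportToWkt())  # POLYGON ((-49 70, -48 70, ...))
```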
64 changes: 34 additions & 30 deletions src/hyp3_autorift/image.py
@@ -6,38 +6,42 @@
 from matplotlib.colors import LinearSegmentedColormap
 from scipy.interpolate import PchipInterpolator

-COLOR_MAP = np.array([
-    # data value, R, G, B, A
-    [0, 255, 255, 255, 0],
-    [2, 166, 238, 255, 255],
-    [4, 97, 195, 219, 255],
-    [9, 84, 169, 254, 255],
-    [16, 84, 130, 254, 255],
-    [25, 84, 85, 254, 255],
-    [36, 50, 119, 220, 255],
-    [49, 16, 153, 186, 255],
-    [64, 16, 186, 153, 255],
-    [81, 50, 220, 119, 255],
-    [100, 84, 254, 85, 255],
-    [121, 118, 221, 51, 255],
-    [144, 153, 186, 16, 255],
-    [169, 187, 152, 17, 255],
-    [196, 221, 118, 51, 255],
-    [225, 255, 85, 85, 255],
-    [289, 255, 25, 85, 255],
-    [324, 213, 1, 72, 255],
-    [361, 158, 1, 66, 255],
-    [400, 140, 0, 51, 255],
-    [441, 122, 0, 166, 255],
-    [484, 140, 0, 191, 255],
-    [529, 159, 0, 217, 255],
-    [576, 213, 0, 255, 255],
-    [625, 255, 0, 138, 255],
-])
+
+COLOR_MAP = np.array(
+    [
+        # data value, R, G, B, A
+        [0, 255, 255, 255, 0],
+        [2, 166, 238, 255, 255],
+        [4, 97, 195, 219, 255],
+        [9, 84, 169, 254, 255],
+        [16, 84, 130, 254, 255],
+        [25, 84, 85, 254, 255],
+        [36, 50, 119, 220, 255],
+        [49, 16, 153, 186, 255],
+        [64, 16, 186, 153, 255],
+        [81, 50, 220, 119, 255],
+        [100, 84, 254, 85, 255],
+        [121, 118, 221, 51, 255],
+        [144, 153, 186, 16, 255],
+        [169, 187, 152, 17, 255],
+        [196, 221, 118, 51, 255],
+        [225, 255, 85, 85, 255],
+        [289, 255, 25, 85, 255],
+        [324, 213, 1, 72, 255],
+        [361, 158, 1, 66, 255],
+        [400, 140, 0, 51, 255],
+        [441, 122, 0, 166, 255],
+        [484, 140, 0, 191, 255],
+        [529, 159, 0, 217, 255],
+        [576, 213, 0, 255, 255],
+        [625, 255, 0, 138, 255],
+    ]
+)


-def make_browse(out_file: Path, data: np.ndarray,
-                min_value: Optional[float] = None, max_value: Optional[float] = 625.) -> Path:
+def make_browse(
+    out_file: Path, data: np.ndarray, min_value: Optional[float] = None, max_value: Optional[float] = 625.0
+) -> Path:
     data_values = COLOR_MAP[:, 0]
     pchip = PchipInterpolator(data_values, np.linspace(0, 1, len(data_values)))
     image = pchip(np.clip(data, min_value, max_value))
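The reformatting does not change the browse-image logic: the first column of COLOR_MAP holds data values at uneven spacing, and the PCHIP interpolant maps them monotonically onto [0, 1], so the colormap spends more of its range on small values. A trimmed-down sketch of that step:

```python
# Sketch of the PCHIP step above: unevenly spaced data values are mapped
# monotonically onto [0, 1], concentrating color resolution at low values.
import numpy as np
from scipy.interpolate import PchipInterpolator

data_values = np.array([0, 2, 4, 9, 16, 25, 100, 625])  # subset of COLOR_MAP[:, 0], for brevity
pchip = PchipInterpolator(data_values, np.linspace(0, 1, len(data_values)))

print(pchip(np.clip([1.0, 50.0, 9999.0], 0, 625)))  # out-of-range inputs are clipped first
```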