Merge pull request #23 from asfadmin/feature-core-search
Feature core search
glshort authored Apr 15, 2021
2 parents 93a8f32 + acbdc0e commit e7bfe5f
Showing 15 changed files with 167 additions and 53 deletions.
15 changes: 13 additions & 2 deletions CHANGELOG.md
@@ -1,6 +1,6 @@
# Changelog

## [0.3.1](https://github.com/asfadmin/Discovery-asf_search/compare/v0.2.4...v0.3.1)
## [0.3.2](https://github.com/asfadmin/Discovery-asf_search/compare/v0.2.4...v0.3.2)

### Added
- Laid out framework for INSTRUMENT constants (needs to be populated)
@@ -14,6 +14,15 @@
- ASFSearchResults now has a geojson() method which returns a data structure that matches the geojson specification
- ASFProduct now has a geojson() method that produces a data structure matching a geojson feature snippet
- ASFSearchResults and ASFProduct both have a __str__() method that serializes the output of their geojson() methods
- Added CodeFactor shield to readme
- Now calculates temporal baselines when building a stack
- New search options (see the usage sketch after this list):
- min/maxDoppler
  - min/maxFaradayRotation
- flightLine
- offNadirAngle
- season
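
A minimal usage sketch of the new options; search() is assumed to remain importable from the package root, and all parameter values here are illustrative:

import asf_search as asf

results = asf.search(
    platform=['UAVSAR'],           # plain strings; PLATFORM constants are also available
    flightLine='example_line_01',  # hypothetical UAVSAR flightline identifier
    minDoppler=-500.0,             # illustrative Doppler bounds
    maxDoppler=500.0,
    season=(152, 243),             # day-of-year range, roughly June through August
    maxResults=10)

print(results.geojson())           # FeatureCollection covering the whole result set
print(results[0])                  # __str__() serializes a single product's geojson()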


### Changed
- No longer uses range type for parameters that accept lists of values and/or ranges. Now expects a 2-value tuple (see the sketch after this list).
@@ -24,12 +33,14 @@
- Flatter structure for constants
- baseline functionality moved into search group (file restructuring)
- ASFProduct is no longer a subclass of dict. Instead, metadata has been moved to .properties and .geometry
- ASFSearchResults is now a subclass of list, for list-like operations
- ASFSearchResults is now a subclass of UserList, for list-type operations
- Newly-built stacks are sorted by temporal baselines, ascending
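
A short sketch of the tuple-style ranges and the new .properties / .geometry access; platform and range values are illustrative:

import asf_search as asf

results = asf.search(
    platform=['ALOS'],
    asfFrame=[300, (310, 320)],    # single values and 2-value tuple ranges mix freely
    offNadirAngle=[(20.0, 35.0)],
    maxResults=5)

for product in results:
    # metadata now lives on .properties and .geometry instead of dict-style access
    print(product.properties['startTime'], product.geometry['type'])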

### Fixed
- Corrected handling of version number in user agent string
- Unused import cleanup
- Better type hinting on centroid() function
- Cleaned up cruft from various refactors

## [0.2.4](https://github.com/asfadmin/Discovery-asf_search/compare/v0.0.0...v0.2.4)

2 changes: 1 addition & 1 deletion README.md
@@ -2,11 +2,11 @@

[![PyPI version](https://img.shields.io/pypi/v/asf_search.svg)](https://pypi.python.org/pypi/asf_search/)
[![Conda version](https://img.shields.io/conda/vn/conda-forge/asf_search)](https://anaconda.org/conda-forge/asf_search)
[![Conda platforms](https://img.shields.io/conda/pn/conda-forge/asf_search)](https://anaconda.org/conda-forge/asf_search)

[![PyPI pyversions](https://img.shields.io/pypi/pyversions/asf_search.svg)](https://pypi.python.org/pypi/asf_search/)
[![PyPI license](https://img.shields.io/pypi/l/asf_search.svg)](https://pypi.python.org/pypi/asf_search/)

[![CodeFactor](https://www.codefactor.io/repository/github/asfadmin/discovery-asf_search/badge)](https://www.codefactor.io/repository/github/asfadmin/discovery-asf_search)
[![Github workflow](https://github.com/asfadmin/asf_search/actions/workflows/run-pytest.yml/badge.svg)](https://github.com/asfadmin/Discovery-asf_search/actions/workflows/run-pytest.yml)

Python wrapper for the ASF SearchAPI
12 changes: 7 additions & 5 deletions asf_search/search/product.py → asf_search/ASFProduct.py
@@ -1,18 +1,20 @@
from typing import Iterable
import numpy as np
import json
from ..download import download_url
from collections import UserList

from asf_search.download import download_url


class ASFProduct:
def __init__(self, args):
def __init__(self, args: dict):
self.properties = args['properties']
self.geometry = args['geometry']

def __str__(self):
return json.dumps(self.geojson(), indent=2, sort_keys=True)

def geojson(self):
def geojson(self) -> dict:
return {
'type': 'Feature',
'geometry': self.geometry,
@@ -34,13 +36,13 @@ def download(self, dir: str, filename: str = None, token: str = None) -> None:

download_url(url=self.properties['url'], dir=dir, filename=filename, token=token)

def stack(self) -> list:
def stack(self) -> UserList:
"""
Builds a baseline stack from this product.
:return: ASFSearchResults(list) of the stack, with the addition of baseline values (temporal, perpendicular) attached to each ASFProduct.
"""
from .baseline_search import stack_from_product
from .search.baseline_search import stack_from_product

return stack_from_product(self)
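
A hedged sketch of the relocated ASFProduct in use; the product name is hypothetical, and product_search() is assumed to stay importable from the package root and to accept a list of product names:

import asf_search as asf

results = asf.product_search(['ALPSRP111041130-RT1'])  # hypothetical product name
reference = results[0]

print(reference.geojson()['type'])          # 'Feature'
stack = reference.stack()                   # ASFSearchResults sorted by temporal baseline
print(stack[0].properties['temporalBaseline'])

# download() writes the product into dir; token is an Earthdata bearer token
# reference.download(dir='./data', token='<EDL token>')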

10 changes: 4 additions & 6 deletions asf_search/search/results.py → asf_search/ASFSearchResults.py
@@ -1,12 +1,10 @@
from collections import UserList
import json
from .product import ASFProduct

class ASFSearchResults(list):
def __init__(self, results: dict):
super(ASFSearchResults, self).__init__()
for product in results['features']:
self.append(ASFProduct(product))
from asf_search.ASFProduct import ASFProduct


class ASFSearchResults(UserList):
def geojson(self):
return {
'type': 'FeatureCollection',
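
A small sketch of the refactored container, mirroring how search() below builds it from ASFProduct instances; the feature dict is illustrative:

from asf_search import ASFProduct, ASFSearchResults

feature = {
    'geometry': {'type': 'Point', 'coordinates': [-145.0, 64.0]},
    'properties': {'startTime': '2021-01-01T00:00:00Z'},
}

results = ASFSearchResults([ASFProduct(feature)])  # UserList accepts an initial list
print(len(results), results.geojson()['type'])     # 1 FeatureCollection
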
2 changes: 2 additions & 0 deletions asf_search/__init__.py
@@ -1,5 +1,7 @@
from importlib.metadata import PackageNotFoundError, version

from .ASFProduct import ASFProduct
from .ASFSearchResults import ASFSearchResults
from .exceptions import *
from .constants import *
from .health import *
3 changes: 2 additions & 1 deletion asf_search/download/download.py
@@ -2,7 +2,8 @@
import urllib.parse
import requests
from importlib.metadata import PackageNotFoundError, version
from ..exceptions import ASFDownloadError

from asf_search.exceptions import ASFDownloadError


def download_url(url: str, dir: str, filename: str = None, token: str = None) -> None:
1 change: 1 addition & 0 deletions asf_search/health/health.py
@@ -1,5 +1,6 @@
import requests
import json

import asf_search.constants

def health(host: str = None) -> dict:
2 changes: 0 additions & 2 deletions asf_search/search/__init__.py
@@ -3,5 +3,3 @@
from .product_search import product_search
from .geo_search import geo_search
from .baseline_search import stack_from_id
from .product import ASFProduct
from .results import ASFSearchResults
43 changes: 25 additions & 18 deletions asf_search/search/baseline_search.py
@@ -1,11 +1,12 @@
from typing import Iterable
import numpy as np
from .search import search
from .results import ASFSearchResults
from .product import ASFProduct
from .product_search import product_search
from ..constants import INTERNAL, PLATFORM
from ..exceptions import ASFSearchError, ASFBaselineError
from dateutil.parser import parse
import pytz

from asf_search.search import search
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search.ASFProduct import ASFProduct
from asf_search.search.product_search import product_search
from asf_search.constants import INTERNAL, PLATFORM
from asf_search.exceptions import ASFSearchError, ASFBaselineError


precalc_platforms = [
@@ -35,14 +36,16 @@ def stack_from_product(
"""

stack_params = get_stack_params(reference)
stack_results = search(**stack_params, host=host, cmr_token=cmr_token, cmr_provider=cmr_provider)
calc_temporal_baselines(reference, stack_results)
stack = search(**stack_params, host=host, cmr_token=cmr_token, cmr_provider=cmr_provider)
calc_temporal_baselines(reference, stack)
stack.sort(key=lambda product: product.properties['temporalBaseline'])


#TODO: Calculate temporal baselines
#TODO: Calculate perpendicular baselines
#TODO: Add nearest neighbor finder

return stack_results
return stack


def stack_from_id(
@@ -68,12 +71,9 @@ def stack_from_id(
cmr_token=cmr_token,
cmr_provider=cmr_provider)

try:
if len(reference_results) <= 0:
raise ASFSearchError(f'Reference product not found: {reference_id}')
reference = reference_results[0]
except KeyError as e:
if len(reference_results) <= 0:
raise ASFSearchError(f'Reference product not found: {reference_id}')
reference = reference_results[0]

return stack_from_product(reference, host=host, cmr_token=cmr_token, cmr_provider=cmr_provider)

@@ -116,6 +116,13 @@ def calc_temporal_baselines(reference: ASFProduct, stack: ASFSearchResults) -> None:
:param stack: The stack to operate on.
:return: None, as the operation occurs in-place on the stack provided.
"""
pass

reference_time = parse(reference.properties['startTime'])
if reference_time.tzinfo is None:
reference_time = pytz.utc.localize(reference_time)

for secondary in stack:
secondary_time = parse(secondary.properties['startTime'])
if secondary_time.tzinfo is None:
secondary_time = pytz.utc.localize(secondary_time)
secondary.properties['temporalBaseline'] = (secondary_time - reference_time).days
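
A worked example of the date handling above, using illustrative timestamps: a timezone-aware reference parses as-is, while a naive secondary time is localized to UTC before differencing.

from dateutil.parser import parse
import pytz

reference_time = parse('2021-01-01T00:00:00Z')    # already timezone-aware
secondary_time = parse('2021-01-13')              # naive, so localize it
secondary_time = pytz.utc.localize(secondary_time)

print((secondary_time - reference_time).days)     # 12, the value stored as 'temporalBaseline'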

7 changes: 4 additions & 3 deletions asf_search/search/geo_search.py
@@ -1,8 +1,9 @@
from typing import Union, Iterable
import datetime
from .search import search
from .results import ASFSearchResults
from ..constants import INTERNAL

from asf_search.search import search
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search.constants import INTERNAL


def geo_search(
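
The geo_search() signature is truncated above; assuming it forwards a WKT intersectsWith value plus the usual keywords to search(), a sketch with an illustrative polygon:

import asf_search as asf

wkt = 'POLYGON((-152.81 58.49,-154.90 57.49,-155.08 56.30,-153.82 56.34,-151.99 57.30,-152.81 58.49))'
results = asf.geo_search(intersectsWith=wkt, maxResults=10)
print(len(results))
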
7 changes: 4 additions & 3 deletions asf_search/search/granule_search.py
@@ -1,7 +1,8 @@
from typing import Iterable
from .search import search
from .results import ASFSearchResults
from ..constants import INTERNAL

from asf_search.search import search
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search.constants import INTERNAL


def granule_search(
7 changes: 4 additions & 3 deletions asf_search/search/product_search.py
@@ -1,7 +1,8 @@
from typing import Iterable
from .search import search
from .results import ASFSearchResults
from ..constants import INTERNAL

from asf_search.search import search
from asf_search.ASFSearchResults import ASFSearchResults
from asf_search.constants import INTERNAL


def product_search(
31 changes: 25 additions & 6 deletions asf_search/search/search.py
@@ -3,32 +3,40 @@
from requests.exceptions import HTTPError
import datetime
import math
from .results import ASFSearchResults
from ..exceptions import ASFSearch4xxError, ASFSearch5xxError, ASFServerError
from ..constants import INTERNAL
from importlib.metadata import PackageNotFoundError, version

from asf_search.ASFSearchResults import ASFSearchResults, ASFProduct
from asf_search.exceptions import ASFSearch4xxError, ASFSearch5xxError, ASFServerError
from asf_search.constants import INTERNAL


def search(
absoluteOrbit: Iterable[Union[int, Tuple[int, int]]] = None,
asfFrame: Iterable[Union[int, Tuple[int, int]]] = None,
beamMode: Iterable[str] = None,
collectionName: Iterable[str] = None,
maxDoppler: float = None,
minDoppler: float = None,
end: Union[datetime.datetime, str] = None,
flightDirection: Iterable[str] = None,
maxFaradayRotation: float = None,
minFaradayRotation: float = None,
flightDirection: str = None,
flightLine: str = None,
frame: Iterable[Union[int, Tuple[int, int]]] = None,
granule_list: Iterable[str] = None,
groupID: Iterable[str] = None,
insarStackId: str = None,
instrument: Iterable[str] = None,
intersectsWith: str = None,
lookDirection: Iterable[str] = None,
offNadirAngle: Iterable[Union[float, Tuple[float, float]]] = None,
platform: Iterable[str] = None,
polarization: Iterable[str] = None,
processingDate: Union[datetime.datetime, str] = None,
processingLevel: Iterable[str] = None,
product_list: Iterable[str] = None,
relativeOrbit: Iterable[Union[int, Tuple[int, int]]] = None,
season: Tuple[int, int] = None,
start: Union[datetime.datetime, str] = None,
maxResults: int = None,
host: str = INTERNAL.HOST,
@@ -42,21 +50,28 @@ def search(
:param asfFrame: This is primarily an ASF / JAXA frame reference. However, some platforms use other conventions. See ‘frame’ for ESA-centric frame searches.
:param beamMode: The beam mode used to acquire the data.
:param collectionName: For UAVSAR and AIRSAR data collections only. Search by general location, site description, or data grouping as supplied by flight agency or project.
:param maxDoppler: Doppler provides an indication of how much the look direction deviates from the ideal perpendicular flight direction acquisition.
:param minDoppler: Doppler provides an indication of how much the look direction deviates from the ideal perpendicular flight direction acquisition.
:param end: End date of data acquisition. Supports timestamps as well as natural language such as "3 weeks ago"
:param maxFaradayRotation: Rotation of the polarization plane of the radar signal impacts imagery, as HH and HV signals become mixed.
:param minFaradayRotation: Rotation of the polarization plane of the radar signal impacts imagery, as HH and HV signals become mixed.
:param flightDirection: Satellite orbit direction during data acquisition
:param flightLine: Specify a flightline for UAVSAR or AIRSAR.
:param frame: ESA-referenced frames are offered to give users a universal framing convention. Each ESA frame has a corresponding ASF frame assigned. See also: asfframe
:param granule_list: List of specific granules. Search results may include several products per granule name.
:param groupID: Identifier used to find products considered to be of the same scene but having different granule names
:param insarStackId: Identifier used to find products of the same InSAR stack
:param instrument: The instrument used to acquire the data. See also: platform
:param intersectsWith: Search by polygon, linestring, or point defined in 2D Well-Known Text (WKT)
:param lookDirection: Left or right look direction during data acquisition
:param offNadirAngle: Off-nadir angles for ALOS PALSAR
:param platform: Remote sensing platform that acquired the data. Platforms that work together, such as Sentinel-1A/1B and ERS-1/2 have multi-platform aliases available. See also: instrument
:param polarization: A property of SAR electromagnetic waves that can be used to extract meaningful information about surface properties of the earth.
:param processingDate: Used to find data that has been processed at ASF since a given time and date. Supports timestamps as well as natural language such as "3 weeks ago"
:param processingLevel: Level to which the data has been processed
:param product_list: List of specific products. Guaranteed to be at most one product per product name.
:param relativeOrbit: Path or track of satellite during data acquisition. For UAVSAR it is the Line ID.
:param season: Start and end day of year for desired seasonal range. This option is used in conjunction with start/end to specify a seasonal range within an overall date range.
:param start: Start date of data acquisition. Supports timestamps as well as natural language such as "3 weeks ago"
:param maxResults: The maximum number of results to be returned by the search
:param host: SearchAPI host, defaults to Production SearchAPI. This option is intended for dev/test purposes.
@@ -65,6 +80,8 @@
:return: ASFSearchResults(list) of search results
"""
#TODO: Add more params now that ranges are refigured
#TODO: Make sure Ziyi's search case is covered

kwargs = locals()
data = dict((k,v) for k,v in kwargs.items() if v is not None and v != '')
@@ -74,6 +91,7 @@
'absoluteOrbit',
'asfFrame',
'frame',
'offNadirAngle',
'relativeOrbit']
for key in flatten_fields:
if key in data:
@@ -112,9 +130,10 @@
raise ASFSearch4xxError(f'HTTP {response.status_code}: {response.json()["error"]["report"]}')
if 500 <= response.status_code <= 599:
raise ASFSearch5xxError(f'HTTP {response.status_code}: {response.json()["error"]["report"]}')
raise ASFServerError
raise ASFServerError(f'HTTP {response.status_code}: {response.json()["error"]["report"]}')

return ASFSearchResults(response.json())
products = [ASFProduct(f) for f in response.json()['features']]
return ASFSearchResults(products)


def flatten_list(items: Iterable[Union[float, Tuple[float, float]]]) -> str:
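
A hedged end-to-end sketch of the reworked search() call and error handling; the platform name and dates are illustrative:

import asf_search as asf
from asf_search.exceptions import ASFSearch4xxError, ASFSearch5xxError, ASFServerError

try:
    results = asf.search(
        platform=['Sentinel-1A'],        # illustrative platform name
        relativeOrbit=[87, (100, 105)],  # flattened into the API request as shown above
        start='3 weeks ago',             # natural-language dates per the docstring
        maxResults=25)
except (ASFSearch4xxError, ASFSearch5xxError) as e:
    print(f'Search rejected or failed upstream: {e}')
except ASFServerError as e:
    print(f'Unexpected SearchAPI response: {e}')
else:
    print(f'{len(results)} products found')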