
Release v0.17.0 - Isolate isce2 #269

Merged
merged 45 commits into from Jun 4, 2024
Commits (45)
41172ff
Adding readme
mfangaritav May 14, 2024
5017ebb
Update README.md
mfangaritav May 14, 2024
ed61adb
Update README.md
mfangaritav May 14, 2024
e158ad9
Update README.md
mfangaritav May 14, 2024
d30abb4
Update README.md
mfangaritav May 15, 2024
be40833
Update README.md
mfangaritav May 15, 2024
6ecd875
remove old docs folder
mfangaritav May 15, 2024
4ad9d51
Update README.md
mfangaritav May 16, 2024
b3c8d45
Update README.md
mfangaritav May 16, 2024
6b9b343
Update README.md
mfangaritav May 16, 2024
3436ce6
Update README.md
mfangaritav May 16, 2024
2d6d5f7
Update README.md
mfangaritav May 16, 2024
b39d35c
Update README.md
mfangaritav May 16, 2024
7a58182
Update README.md
mfangaritav May 16, 2024
ca72105
Update README.md
mfangaritav May 16, 2024
158eda2
Update README.md
mfangaritav May 16, 2024
8777313
Update README.md
mfangaritav May 16, 2024
9ae5631
Update README.md
mfangaritav May 16, 2024
84d2597
Update README.md
jhkennedy May 20, 2024
9da50ce
Merge pull request #257 from mfangaritav/update-readme
mfangaritav May 21, 2024
d44d53a
refactor to isolate isce2 processing
mfangaritav May 28, 2024
4dd2773
Update src/hyp3_autorift/sentinel1_isce2.py
mfangaritav May 28, 2024
4902ac9
Update src/hyp3_autorift/process.py
mfangaritav May 28, 2024
763da8b
Update environment.yml
mfangaritav May 28, 2024
8686fd3
renaming sentinel1_isce2.py to s1_isce2.py
mfangaritav May 28, 2024
a524900
Merge pull request #260 from mfangaritav/develop
mfangaritav May 28, 2024
dfce293
Minor changes
mfangaritav May 28, 2024
ffd7f0d
Merge pull request #261 from mfangaritav/develop
mfangaritav May 28, 2024
b0b8181
Minor changes
mfangaritav May 28, 2024
7f48494
Merge pull request #262 from mfangaritav/develop
mfangaritav May 28, 2024
1551246
Minor changes
mfangaritav May 28, 2024
4f276fa
Merge pull request #263 from mfangaritav/develop
mfangaritav May 28, 2024
3da5d11
Minor changes
mfangaritav May 28, 2024
6ff330d
Merge pull request #264 from mfangaritav/develop
mfangaritav May 28, 2024
53c09ad
Minor changes
mfangaritav May 28, 2024
b073fe7
Merge pull request #265 from mfangaritav/develop
mfangaritav May 28, 2024
30dd6a5
Minor changes
mfangaritav May 28, 2024
e243b54
Merge pull request #266 from mfangaritav/develop
mfangaritav May 28, 2024
2ade4d8
Merge pull request #267 from ASFHyP3/isolate-isce2
mfangaritav May 29, 2024
ed1a78f
fully isolate S1
jhkennedy Jun 3, 2024
0d98b33
Update CHANGELOG.md
jhkennedy Jun 3, 2024
5f72411
Update README.md
jhkennedy Jun 3, 2024
187068c
Update CHANGES.diff
jhkennedy Jun 3, 2024
006fd91
Merge pull request #268 from ASFHyP3/more-refactor
mfangaritav Jun 4, 2024
b0aa1ba
Update CHANGELOG.md
jhkennedy Jun 4, 2024
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,10 @@ and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).


## [0.17.0]
### Changed
* In preparation for a major update, the Sentinel-1 processing workflow has been isolated to a new `hyp3_autorift.s1_isce2` module.

## [0.16.0]
### Fixed
* `hyp3_autorift` will no longer attempt to crop files with no valid data
186 changes: 184 additions & 2 deletions README.md
@@ -1,3 +1,185 @@
# HyP3 autoRIFT

A HyP3 plugin for feature tracking processing with AutoRIFT-ISCE
# HyP3 autoRIFT Plugin

[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.4037015.svg)](https://doi.org/10.5281/zenodo.4037015)

The HyP3-autoRIFT plugin provides a set of workflows for feature tracking processing with the [autoRIFT](https://github.com/nasa-jpl/autoRIFT) (autonomous Repeat Image Feature Tracking) software package. This plugin is part of the [Alaska Satellite Facility's](https://asf.alaska.edu) larger HyP3 (Hybrid Pluggable Processing Pipeline) system, a batch processing pipeline designed for on-demand processing of remote sensing data. For more information on HyP3, see the [Background](#background) section.

## Installation

1. Ensure that conda is installed on your system (we recommend using [mambaforge](https://github.com/conda-forge/miniforge#mambaforge) to reduce setup times).
2. Clone the `hyp3-autorift` repository and navigate to the root directory of this project
```bash
git clone https://github.com/ASFHyP3/hyp3-autorift.git
cd hyp3-autorift
```
3. Create and activate your Python environment
```bash
mamba env create -f environment.yml
mamba activate hyp3-autorift
```
4. Finally, install a development version of HyP3 autoRIFT
```bash
python -m pip install -e .
```
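As an optional sanity check after step 4 (the helper below is a hypothetical sketch, not part of hyp3_autorift), you can confirm the editable install is importable from the activated environment:

```python
# Optional sanity check: confirm a package is importable from the
# currently activated environment. `is_installed` is a hypothetical
# helper, not part of hyp3_autorift.
import importlib.util


def is_installed(package: str) -> bool:
    """Return True if `package` can be imported in this environment."""
    return importlib.util.find_spec(package) is not None


print("hyp3_autorift installed:", is_installed("hyp3_autorift"))
```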

## Usage

The HyP3-autoRIFT plugin provides workflows (accessible directly in Python or via a CLI) for processing SAR or optical data with autoRIFT. HyP3-autoRIFT can process data from these satellite missions:
* SAR:
  * Sentinel-1
* Optical:
  * Sentinel-2
  * Landsat 4, 5, 7, 8, 9

To see all available workflows, run:
```
python -m hyp3_autorift ++help
```

### `hyp3_autorift` workflow

The `hyp3_autorift` workflow is used to get dense feature tracking between two images using autoRIFT. You can run this workflow by selecting the `hyp3_autorift` process:
```
python -m hyp3_autorift ++process hyp3_autorift [WORKFLOW_ARGS]
```
or by using the `hyp3_autorift` console script:
```
hyp3_autorift [WORKFLOW_ARGS]
```
For example:

```
hyp3_autorift \
    "S2B_MSIL1C_20200612T150759_N0209_R025_T22WEB_20200612T184700" \
    "S2A_MSIL1C_20200627T150921_N0209_R025_T22WEB_20200627T170912"
```

This command will run autoRIFT for a pair of Sentinel-2 images.

> [!IMPORTANT]
> Credentials are necessary to access Landsat and Sentinel-1 data. See the [Credentials](#credentials) section for more information.

For all options available to this workflow, see the help documentation:
```
hyp3_autorift --help
```

### Credentials

Depending on the mission being processed, some workflows will need you to provide credentials. Generally, credentials are provided via environment variables, but some may be provided by command-line arguments or via a `.netrc` file.

#### AWS Credentials

To process Landsat images, you must provide AWS credentials because the data is hosted by USGS in a "requester pays" bucket. To provide AWS credentials, you can either use an AWS profile specified in your `~/.aws/credentials` file by exporting:
```
export AWS_PROFILE=your-profile
```
or by exporting credential environment variables:
```
export AWS_ACCESS_KEY_ID=your-id
export AWS_SECRET_ACCESS_KEY=your-key
export AWS_SESSION_TOKEN=your-token # optional; for when using temporary credentials
```

For more information, please see: <https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html>
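As a rough illustration of the two credential sources described above (the helper below is hypothetical, not part of hyp3_autorift or boto3, which performs its own, more thorough resolution), you can check which source your shell currently provides:

```python
# Hypothetical helper: report which of the AWS credential sources
# described above is visible in the current environment. This is only a
# quick check; boto3 resolves credentials from additional sources too.
import os


def aws_credential_source(env=os.environ):
    """Return 'profile', 'environment variables', or None."""
    if env.get("AWS_PROFILE"):
        return "profile"
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return "environment variables"
    return None


print(aws_credential_source() or "no AWS credentials detected")
```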

#### NASA Earthdata Login and ESA Copernicus Data Space Ecosystem (CDSE)

To process Sentinel-1 images, you must provide Earthdata Login credentials and ESA Copernicus Data Space Ecosystem (CDSE) credentials in order to download input data.
* If you do not already have an Earthdata account, you can sign up [here](https://urs.earthdata.nasa.gov/home).
* If you do not already have a CDSE account, you can sign up [here](https://dataspace.copernicus.eu).

For Earthdata login and CDSE, you can provide credentials by exporting environment variables:
```
export EARTHDATA_USERNAME=your-edl-username
export EARTHDATA_PASSWORD=your-edl-password
export ESA_USERNAME=your-esa-username
export ESA_PASSWORD=your-esa-password
```
or via your [`~/.netrc` file](https://everything.curl.dev/usingcurl/netrc) which should contain lines like these two:
```
machine urs.earthdata.nasa.gov login your-edl-username password your-edl-password
machine dataspace.copernicus.eu login your-esa-username password your-esa-password
```

> [!TIP]
> Your `~/.netrc` file should only be readable by your user; otherwise, you'll receive a "net access too permissive" error. To fix, run:
> ```
> chmod 0600 ~/.netrc
> ```
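If you use the `~/.netrc` route, Python's standard-library `netrc` parser offers a quick way to confirm both machine entries are present (`missing_netrc_hosts` below is a hypothetical sketch, not part of hyp3_autorift):

```python
# Sketch: list which of the two required .netrc machine entries are
# missing, using only the standard library. Hypothetical helper.
import netrc
import os

REQUIRED_HOSTS = ("urs.earthdata.nasa.gov", "dataspace.copernicus.eu")


def missing_netrc_hosts(path=None):
    """Return the required hosts that have no entry in the .netrc file."""
    path = path or os.path.expanduser("~/.netrc")
    try:
        entries = netrc.netrc(path)
    except (OSError, netrc.NetrcParseError):
        return list(REQUIRED_HOSTS)  # missing, unreadable, or malformed file
    return [host for host in REQUIRED_HOSTS if entries.authenticators(host) is None]


print(missing_netrc_hosts() or "all required .netrc entries present")
```

Note that `netrc.netrc` itself raises `NetrcParseError` on POSIX systems when the file is readable by other users, which matches the permissions tip above.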

### Docker Container

The ultimate goal of this project is to create a Docker container that can run autoRIFT workflows within a HyP3 deployment. To run the current version of the project's container, use this command:
```
docker run -it --rm \
    -e AWS_ACCESS_KEY_ID=[YOUR_KEY] \
    -e AWS_SECRET_ACCESS_KEY=[YOUR_SECRET] \
    -e EARTHDATA_USERNAME=[YOUR_USERNAME_HERE] \
    -e EARTHDATA_PASSWORD=[YOUR_PASSWORD_HERE] \
    -e ESA_USERNAME=[YOUR_USERNAME_HERE] \
    -e ESA_PASSWORD=[YOUR_PASSWORD_HERE] \
    ghcr.io/asfhyp3/hyp3-autorift:latest \
    ++process hyp3_autorift \
    [WORKFLOW_ARGS]
```

> [!TIP]
> You can use [`docker run --env-file`](https://docs.docker.com/reference/cli/docker/container/run/#env) to capture all the necessary environment variables in a single file.
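For example, a small script (hypothetical, not part of hyp3_autorift) can collect whichever of the credential variables above are set into an env-file suitable for `docker run --env-file`:

```python
# Hypothetical helper: write NAME=value lines for each credential
# variable that is set in the environment, producing a file usable with
# `docker run --env-file`. Variable names are the ones this README uses.
import os

CREDENTIAL_VARS = (
    "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN",
    "EARTHDATA_USERNAME", "EARTHDATA_PASSWORD",
    "ESA_USERNAME", "ESA_PASSWORD",
)


def write_env_file(path, env=os.environ):
    """Write the set credential variables to `path`; return their names."""
    present = [name for name in CREDENTIAL_VARS if name in env]
    with open(path, "w") as f:
        for name in present:
            f.write(f"{name}={env[name]}\n")
    return present


write_env_file("autorift.env")
```

You could then run `docker run -it --rm --env-file autorift.env ghcr.io/asfhyp3/hyp3-autorift:latest ++process hyp3_autorift [WORKFLOW_ARGS]`. Keep the env-file out of version control, since it contains secrets.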

#### Docker Outputs

To retain hyp3_autorift output files when running via Docker, there are two recommended approaches:

1. Use a volume mount

Add the `-w /tmp -v ${PWD}:/tmp` flags after `docker run`; `-w` changes the working directory inside the container to `/tmp`, and `-v` mounts your current working directory to `/tmp` inside the container so that hyp3_autorift outputs are preserved locally. You can replace `${PWD}` with any valid path.

1. Copy outputs to a remote AWS S3 Bucket

Append the `--bucket` and `--bucket-prefix` arguments to `[WORKFLOW_ARGS]` so that the final output files are uploaded to AWS S3 (`AWS_SESSION_TOKEN` is only needed when using temporary credentials). This also requires that the running container has AWS credentials with write access to the bucket. For example, to write outputs to a hypothetical bucket `s3://hypothetical-bucket/test-run/`:

```
docker run -it --rm \
    -e AWS_ACCESS_KEY_ID=[YOUR_KEY] \
    -e AWS_SECRET_ACCESS_KEY=[YOUR_SECRET] \
    -e AWS_SESSION_TOKEN=[YOUR_TOKEN] \
    -e EARTHDATA_USERNAME=[YOUR_USERNAME_HERE] \
    -e EARTHDATA_PASSWORD=[YOUR_PASSWORD_HERE] \
    -e ESA_USERNAME=[YOUR_USERNAME_HERE] \
    -e ESA_PASSWORD=[YOUR_PASSWORD_HERE] \
    ghcr.io/asfhyp3/hyp3-autorift:latest \
    ++process hyp3_autorift \
    [WORKFLOW_ARGS] \
    --bucket "hypothetical-bucket" \
    --bucket-prefix "test-run"
```

## Background
HyP3 is broken into two components: the cloud architecture/API that manages the processing of HyP3 workflows, and Docker container plugins that contain scientific workflows for producing new science products from a variety of data sources (see the figure below for the full HyP3 architecture).

![Cloud Architecture](images/arch_here.jpg)

The cloud infrastructure-as-code for HyP3 can be found in the main [HyP3 repository](https://github.com/asfhyp3/hyp3), while this repository contains a plugin that can be used for feature tracking processing with autoRIFT.

## License
The HyP3-autoRIFT plugin is licensed under the BSD 3-Clause license. See the LICENSE file for more details.

## Code of conduct
We strive to create a welcoming and inclusive community for all contributors to HyP3-autoRIFT. As such, all contributors to this project are expected to adhere to our code of conduct.

Please see `CODE_OF_CONDUCT.md` for the full code of conduct text.

## Contributing
Contributions to the HyP3-autoRIFT plugin are welcome! If you would like to contribute, please submit a pull request on the GitHub repository.

## Contact Us
Want to talk about HyP3-autoRIFT? We would love to hear from you!

Found a bug? Want to request a feature?
[open an issue](https://github.com/ASFHyP3/asf_tools/issues/new)

General questions? Suggestions? Or just want to talk to the team?
[chat with us on gitter](https://gitter.im/ASFHyP3/community)
162 changes: 0 additions & 162 deletions docs/api_example.md

This file was deleted.

Binary file removed docs/imgs/get_jobs_query.png
Binary file removed docs/imgs/get_user_execute.png
Binary file removed docs/imgs/get_user_try.png
Binary file removed docs/imgs/post_jobs_execute.png
Binary file removed docs/imgs/vertex-sign-in.png