From 98ab0661316e64b1566abe7b0a1e8d2e1d089dfd Mon Sep 17 00:00:00 2001
From: David LeBauer
Date: Sat, 29 Aug 2020 23:55:01 -0700
Subject: [PATCH 1/8] Delete HOW_TO.md

---
 HOW_TO.md | 129 ------------------------------------------------------
 1 file changed, 129 deletions(-)
 delete mode 100644 HOW_TO.md

diff --git a/HOW_TO.md b/HOW_TO.md
deleted file mode 100644
index a4fe949..0000000
--- a/HOW_TO.md
+++ /dev/null
@@ -1,129 +0,0 @@
-# How To Use This Template
-This document describes how to use this transformer template for your custom algorithm processing plot-level RGB data.
-
-## Assumptions
-It is assumed that:
-- you are generating a Docker image containing your algorithm and that you have Docker installed on your computer
-- you are familiar with GitHub template repositories, or know how to use `git`
-
-## Steps to take
-The following steps can be taken to develop your algorithm for inclusion into a processing pipeline.
-
-1. [Setup](#setup): Click the `Use this template` button in GitHub to make a copy of this repository (or run `git clone`)
-2. [Definitions](#definitions): Fill in and modify the definitions in the `algorithm_rgb.py` file
-3. [Algorithm](#algorithm): Replace the code in the `calculate` function with your algorithm
-4. [Test](#test): Run the `testing.py` script to run your algorithm and validate the results
-5. [Generate](#generate): Run `generate.py` to create a Dockerfile
-6. [Docker](#build_docker): Create a Docker image for your algorithm and publish it
-7. [Finishing](#finishing): Finish up your development efforts
-
-### Setup your repo
-The first thing to do is to create a copy of this repository that has a meaningful name and that you are able to modify.
-In GitHub this is easy: browse to this [repository](https://github.com/AgPipeline/template-rgb-plot) and click the `Use this template` button.
-You will be led through the steps necessary to create a clone in a location of your choosing.
-
-If you are not on GitHub, you will need to set up your `git` environment and clone the repository.
-
-### Fill in your definitions
-To fill in the needed definitions, first open the `algorithm_rgb.py` file in your favorite editor.
-
-If you are modifying your existing code, you should consider updating the version number definition: `VERSION`.
-It's assumed that [Semantic Version numbers](https://semver.org/) will be used, but any versioning methodology can be used.
-
-Fill in the algorithm definitions with the creator(s) of the algorithm: `ALGORITHM_AUTHOR`, `ALGORITHM_AUTHOR_EMAIL`, `ALGORITHM_NAME`, and `ALGORITHM_DESCRIPTION`.
-Multiple names for `ALGORITHM_AUTHOR` and multiple emails for `ALGORITHM_AUTHOR_EMAIL` are supported.
-It's best if only one algorithm name is used, but call it what you want.
-The safest naming convention is to convert any white space or other special characters to periods (.), which allows different systems to more easily change the name if needed.
-
-Next fill in the citation information that will be used in the generated CSV file: `CITATION_AUTHOR`, `CITATION_TITLE`, and `CITATION_YEAR`.
-Be sure to enter the citation information accurately since some systems may expect exact matches.
-
-The names of the variables are used to determine the number of returned values your algorithm produces: `VARIABLE_NAMES`.
-Enter a variable name for each returned value, in the order the values are returned, separated by commas.
-Be sure to enter them accurately since some systems may expect exact matches.
-It is considered an error to have a mismatch between the number of variable names and the number of returned values.
-
-A CSV file suitable for ingestion into [BETYdb](https://www.betydb.org/) is generated depending upon the value of the `WRITE_BETYDB_CSV` variable.
-Setting this value to `False` will suppress the generation of this file by default.
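To make the definition steps above concrete, here is a minimal sketch of what a filled-in `algorithm_rgb.py` might look like. All names, emails, and the two greenness indices are illustrative placeholders, not the template's real metadata or algorithm:

```python
# Sketch only: hypothetical values for the definitions in algorithm_rgb.py,
# plus an example calculate() returning one value per name in VARIABLE_NAMES.
import numpy as np

VERSION = '1.0.0'

ALGORITHM_AUTHOR = 'Jane Doe, Juan Perez'          # multiple authors are supported
ALGORITHM_AUTHOR_EMAIL = 'jdoe@example.org, jperez@example.org'
ALGORITHM_NAME = 'example.greenness.index'         # periods instead of white space
ALGORITHM_DESCRIPTION = 'Example greenness indices over a plot-level RGB image'

CITATION_AUTHOR = 'Doe, Jane'
CITATION_TITLE = 'An example greenness algorithm'
CITATION_YEAR = '2020'

# One name per returned value, in return order, comma separated
VARIABLE_NAMES = 'excess_greenness_index,green_leaf_index'

WRITE_BETYDB_CSV = True
WRITE_GEOSTREAMS_CSV = True


def calculate(pxarray: np.ndarray):
    """Example algorithm: plot-wide means of two common greenness indices."""
    red, green, blue = (pxarray[:, :, idx].astype(float) for idx in range(3))
    excess_greenness = 2.0 * green - (red + blue)
    green_leaf = (2.0 * green - red - blue) / (2.0 * green + red + blue + 1e-9)
    # Return values in the same order as VARIABLE_NAMES
    return np.mean(excess_greenness), np.mean(green_leaf)
```

Note that the number of values returned by `calculate` matches the number of comma-separated entries in `VARIABLE_NAMES`; a mismatch is treated as an error.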
-
-A CSV file suitable for ingestion into [TERRA REF Geostreams](https://docs.terraref.org/user-manual/data-products/environmental-conditions) is generated depending upon the value of the `WRITE_GEOSTREAMS_CSV` variable.
-Setting this value to `False` will suppress the generation of this file by default.
-
-Be sure to save your changes.
-
-### Add your algorithm
-Open the `algorithm_rgb.py` file in your favorite editor, if it isn't opened already.
-
-Scroll to the bottom of the file to the function named `calculate`.
-
-Replace the comment starting with `# ALGORITHM` and the line below it with your calculation(s).
-As needed, change the name of the array used in your algorithm to match the function's parameter `pxarray`.
-
-Once you have your algorithm in place, replace the comment starting with `# RETURN` and the line below it with your return values.
-Remember to order your return values to match the declared names in the `VARIABLE_NAMES` definition.
-
-Modify the rest of the file as necessary if your algorithm needs additional import statements, functions, classes, or other code.
-
-Be sure to save your changes.
-
-### Test your algorithm
-A testing script named `testing.py` is provided for testing your algorithm.
-The plot-level RGB images to test against are not provided in the template repository.
-It's expected that you will either provide the images or use a standard set that can be downloaded from [Google Drive](https://drive.google.com/file/d/1xWRU0YgK3Y9aUy5TdRxj14gmjLlozGxo/view?usp=sharing).
-
-The testing script requires `numpy` and `gdal` to be installed on the testing system.
-
-The testing script expects either a list of source plot image files, a folder name, or both to be specified on the command line.
-
-For example, if your files reside in `/user/myself/test_images`, the command to test could be the following:
-```./testing.py /user/myself/test_images```
-
-### Generate the docker build command file
-Now that you have created your algorithm and tested it to your satisfaction, it's time to make a Docker image so that it can run as part of a workflow.
-
-To assist in this effort we've provided a script named `generate.py` to produce a file containing the Docker commands needed.
-Running this script will produce not only a Docker command file named `Dockerfile`, but also two other files that can be used to install additional dependencies your algorithm needs.
-These two other files are named `requirements.txt`, for additional Python modules, and `packages.txt`, for other dependencies.
-
-To generate these files, just run `generate.py`.
-
-If your algorithm has additional Python module dependencies, edit `requirements.txt` and add the names of the modules.
-The listed modules will then be installed as part of the Docker build process.
-
-If there are other dependencies needed by your algorithm, add them to the `packages.txt` file.
-The packages listed will be installed using `apt-get` as part of the Docker build process.
-
-### Create the Docker image
-Now that you have generated your `Dockerfile` and specified any Python modules and other packages needed by your algorithm, you are ready to create a Docker image of your algorithm.
-
-A sample Docker build command could be: ```docker build -t my_algorithm:latest ./```
-Please refer to the Docker documentation for additional information on building a Docker image.
-
-Once the image is built, you can run it locally or push it to an image repository, such as [DockerHub](https://hub.docker.com/).
-Please note that there may be naming requirements for pushing images to a repository.
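To illustrate the naming note above: Docker Hub expects images pushed under an account namespace, so a local tag typically needs to be retagged first. This is a sketch only; the `myaccount` namespace and `1.0` tag are placeholders for your own account and version:

```shell
# Hypothetical example of retagging and pushing to Docker Hub.
# "myaccount" and the "1.0" tag are placeholders.
docker login
docker tag my_algorithm:latest myaccount/my_algorithm:1.0
docker push myaccount/my_algorithm:1.0
```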
-
-**Testing the Docker image**
-Using the same image setup as used when [testing your algorithm](#test), a sample command line to run the image could be:
-```docker run --rm --mount "src=/user/myself,target=/mnt,type=bind" my_algorithm:latest --working_space "/mnt" "/mnt/images"```
-
-Breaking apart this command line, we have the following pieces:
-- `docker run` tells Docker to run an instance of the image (specified later in the command)
-- `--rm` tells Docker to remove the container (an image instance) when it's completed
-- `--mount "src=/user/myself,target=/mnt,type=bind"` specifies that the */user/myself* path is to be made available as */mnt* in the container
-- `my_algorithm:latest` is the image to run (the running image is known as a *container*)
-- `--working_space "/mnt"` lets the software in the container know where its working disk space is located; files are created here
-- `"/mnt/images"` specifies where the plot-level image files are located
-
-The `--mount` command line parameter is important since it allows the running container to access the local file system.
-The container can then load the images from the file system directly, without having to perform any copies.
-The parameters after the Docker image name are all relative to the target folder specified with this command line parameter.
-
-Once the image files have been processed, the resulting CSV file(s) will be located in the folder at `/user/myself` (in this example).
-
-### Finishing up
-Now that you've created your algorithm, there are a few more things to take care of:
-
-1. Make sure you've checked your changes into source control; you don't want to lose all that hard work!
-2. Update the README.md file, filling out the sections with information on your algorithm; others will want to know so they can use it!
-3. Submit any requests to our ticketing system on GitHub: https://github.com/AgPipeline/computing-pipeline/issues/new/choose

From f2109e48505ed043de11c0f4963e09a6650e95be Mon Sep 17 00:00:00 2001
From: David LeBauer
Date: Sat, 29 Aug 2020 23:56:15 -0700
Subject: [PATCH 2/8] Delete generate.py

---
 generate.py | 147 ----------------------------------------------------
 1 file changed, 147 deletions(-)
 delete mode 100755 generate.py

diff --git a/generate.py b/generate.py
deleted file mode 100755
index 04b7d2f..0000000
--- a/generate.py
+++ /dev/null
@@ -1,147 +0,0 @@
-#!/usr/bin/env python3
-
-"""Generates files used to create Docker images
-"""
-import datetime
-
-import algorithm_rgb
-
-# Names of empty files to create
-EMPTY_FILE_NAMES = ['requirements.txt', 'packages.txt']
-
-# The name of the Docker build file
-DOCKERFILE_NAME = 'Dockerfile'
-
-# Template contents of the Docker build file
-DOCKERFILE_CONTENTS = [
-    'FROM agpipeline/rgb-plot-base-image:latest',
-    'LABEL maintainer="Someone "',
-    '',
-    'COPY requirements.txt packages.txt /home/extractor/',
-    '',
-    'USER root',
-    '',
-    'RUN [ -s /home/extractor/packages.txt ] && \\',
-    '    (echo "Installing packages" && \\',
-    '    apt-get update && \\',
-    '    cat /home/extractor/packages.txt | xargs apt-get install -y --no-install-recommends && \\',
-    '    rm /home/extractor/packages.txt && \\',
-    '    apt-get autoremove -y && \\',
-    '    apt-get clean && \\',
-    '    rm -rf /var/lib/apt/lists/*) || \\',
-    '    (echo "No packages to install" && \\',
-    '    rm /home/extractor/packages.txt)',
-    '',
-    'RUN [ -s /home/extractor/requirements.txt ] && \\',
-    '    (echo "Install python modules" && \\',
-    '    python -m pip install -U --no-cache-dir pip && \\',
-    '    python -m pip install --no-cache-dir setuptools && \\',
-    '    python -m pip install --no-cache-dir -r /home/extractor/requirements.txt && \\',
-    '    rm /home/extractor/requirements.txt) || \\',
-    '    (echo "No python modules to install" && \\',
-    '    rm /home/extractor/requirements.txt)',
-    '',
-    'USER extractor',
-    '',
-    'COPY algorithm_rgb.py /home/extractor/'
-]
-
-# Required variables in algorithm_rgb
-REQUIRED_VARIABLES = [
-    'ALGORITHM_AUTHOR',
-    'ALGORITHM_AUTHOR_EMAIL',
-    'ALGORITHM_NAME',
-    'ALGORITHM_DESCRIPTION',
-    'VARIABLE_NAMES'
-]
-
-# Variables in algorithm_rgb that are required to not be empty
-REQUIRED_NOT_EMPTY_VARIABLES = [
-    'VARIABLE_NAMES'
-]
-
-# Variables in algorithm_rgb that should be filled in, but aren't required to be
-PREFERRED_NOT_EMPTY_VARIABLES = [
-    'ALGORITHM_AUTHOR',
-    'ALGORITHM_AUTHOR_EMAIL',
-    'ALGORITHM_NAME',
-    'ALGORITHM_DESCRIPTION',
-    'CITATION_AUTHOR',
-    'CITATION_TITLE',
-    'CITATION_YEAR'
-]
-
-
-def check_environment() -> bool:
-    """Checks that we have the information we need to generate the files
-    Returns:
-        Returns True if everything appears to be OK and False if there's a problem detected
-    """
-    # Check for missing definitions
-    bad_values = []
-    for one_attr in REQUIRED_VARIABLES:
-        if not hasattr(algorithm_rgb, one_attr):
-            bad_values.append(one_attr)
-    if bad_values:
-        print("The following variables are not globally defined in algorithm_rgb.py: %s" % ', '.join(bad_values))
-        print("Please add the variables and try again")
-        return False
-
-    # Check for empty values
-    for one_attr in REQUIRED_NOT_EMPTY_VARIABLES:
-        if not getattr(algorithm_rgb, one_attr, None):
-            bad_values.append(one_attr)
-    if bad_values:
-        print("The following variables are empty in algorithm_rgb.py: %s" % ', '.join(bad_values))
-        print("Please assign values to the variables and try again")
-        return False
-
-    # Warnings
-    for one_attr in PREFERRED_NOT_EMPTY_VARIABLES:
-        if not hasattr(algorithm_rgb, one_attr) or not getattr(algorithm_rgb, one_attr, None):
-            bad_values.append(one_attr)
-    if bad_values:
-        print("The following variables are missing or empty when it would be better to have them defined and filled in: %s" %
-              ', '.join(bad_values))
-        print("Continuing to generate files ...")
-
-    return True
-
-
-def generate_files() -> int:
"""Generated files needed to create a Docker image - Return: - Returns an integer representing success; zero indicates success and any other value represents failure. - """ - try: - for one_name in EMPTY_FILE_NAMES: - open(one_name, 'a').close() - except Exception as ex: - print("Exception caught while attempting to create files: %s" % str(ex)) - print("Stopping file generation") - return -1 - - # Create the Dockerfile - try: - with open(DOCKERFILE_NAME, "w") as out_file: - out_file.write('# automatically generated: %s\n' % datetime.datetime.now().isoformat()) - for line in DOCKERFILE_CONTENTS: - if line.startswith('LABEL maintainer='): - out_file.write("LABEL maintainer=\"{0} <{1}>\"\n".format(algorithm_rgb.ALGORITHM_AUTHOR, - algorithm_rgb.ALGORITHM_AUTHOR_EMAIL)) - else: - out_file.write("{0}\n".format(line)) - except Exception as ex: - print("Exception caught while attempting to create Docker build file: %s" % str(ex)) - print("Stopping build file generation") - return -2 - - return 0 - - -# Make the call to generate the files -if __name__ == "__main__": - print('Confirming the environment') - if check_environment(): - print('Configuring files') - generate_files() From 108634ecb7d045b41d6aee1e894a5497c0d2f9ec Mon Sep 17 00:00:00 2001 From: Chris Schnaufer Date: Thu, 3 Sep 2020 14:38:09 -0700 Subject: [PATCH 3/8] Initial docker testing --- .github/workflows/docker_test_check.sh | 3 ++ .github/workflows/testing_docker.yaml | 70 ++++++++++++++++++++++++++ test_data/experiment.yaml | 9 ++++ 3 files changed, 82 insertions(+) create mode 100644 .github/workflows/docker_test_check.sh create mode 100644 .github/workflows/testing_docker.yaml create mode 100755 test_data/experiment.yaml diff --git a/.github/workflows/docker_test_check.sh b/.github/workflows/docker_test_check.sh new file mode 100644 index 0000000..3c0e526 --- /dev/null +++ b/.github/workflows/docker_test_check.sh @@ -0,0 +1,3 @@ +#!/usr/bin/env bash + +# Checks that the docker test run succeeded diff 
--git a/.github/workflows/testing_docker.yaml b/.github/workflows/testing_docker.yaml new file mode 100644 index 0000000..08cf433 --- /dev/null +++ b/.github/workflows/testing_docker.yaml @@ -0,0 +1,70 @@ +name: Testing Docker image +on: + push: + branches: + - master + - develop + - docker_test_workflow + pull_request: + branches: + - master + - develop + tags: + - v* + +jobs: + docker_testing: + runs-on: ubuntu-latest + name: Running Docker testing + steps: + - name: Fetch source code + uses: actions/checkout@v2 + - name: Create folders + run: | + mkdir ./inputs && chmod 777 ./inputs + mkdir ./outputs && chmod 777 ./outputs + - name: List folder contents + run: | + echo "Current folder" && ls -la + echo "test_data" && ls -l ./test_data + - name: Copy testing data files + run: | + cp "${PWD}/test_data"/* "${PWD}/inputs/" + echo "inputs" && ls -l ./inputs + - name: Folder contents + run: | + echo "Current folder" && ls -l + echo "Inputs folder" && ls -l ./inputs + echo "Outputs folder" && ls -l ./outputs + - name: Build docker image + run: docker build -t greenness_test:latest ./ + - name: Compress docker image + run: docker save greenness_test:latest | gzip -7 -c - > greenness_test_image.tar.gz + - name: Upload docker image + uses: actions/upload-artifact@v2 + with: + name: greenness_test_image + path: greenness_test_image.tar.gz + - name: Folder contents + run: | + echo "Current folder" && ls -l + echo "Inputs folder" && ls -l ./inputs + echo "Outputs folder" && ls -l ./outputs + - name: Run docker test + run: docker run --rm -v "${PWD}/inputs:/inputs" -v "${PWD}/outputs:/outputs" greenness_test:latest --working_space /outputs --metadata /inputs/experiment.yaml /inputs/rgb_1_2_E.tif + - name: Output folder contents + run: echo "Outputs folder" && ls -l ./outputs + - name: Check outputs + run: | + chmod +x "./.github/workflows/docker_test_check.sh" + "./.github/workflows/docker_test_check.sh" + + artifact_cleanup: + runs-on: ubuntu-latest + needs: [docker_testing] 
+    name: Cleanup artifacts upon success
+    steps:
+      - name: Remove docker artifact
+        uses: geekyeggo/delete-artifact@v1
+        with:
+          name: greenness_test_image
\ No newline at end of file

diff --git a/test_data/experiment.yaml b/test_data/experiment.yaml
new file mode 100755
index 0000000..efae358
--- /dev/null
+++ b/test_data/experiment.yaml
@@ -0,0 +1,9 @@
+%YAML 1.1
+---
+pipeline:
+  studyName: 'S7_20181011'
+  season: 'S7_20181011'
+  germplasmName: Sorghum bicolor
+  collectingSite: Maricopa
+  observationTimeStamp: '2018-10-11T13:01:02-08:00'
+

From b15e2b87b986e9be63c51274914768186630e3ad Mon Sep 17 00:00:00 2001
From: Chris Schnaufer
Date: Thu, 3 Sep 2020 15:12:28 -0700
Subject: [PATCH 4/8] Fleshing out validation

---
 .github/workflows/docker_test_check.sh | 76 ++++++++++++++++++++++++++
 .github/workflows/testing_docker.yaml  |  1 +
 2 files changed, 77 insertions(+)

diff --git a/.github/workflows/docker_test_check.sh b/.github/workflows/docker_test_check.sh
index 3c0e526..7b2a32d 100644
--- a/.github/workflows/docker_test_check.sh
+++ b/.github/workflows/docker_test_check.sh
@@ -1,3 +1,79 @@
 #!/usr/bin/env bash
 
 # Checks that the docker test run succeeded
+
+# Define expected results
+EXPECTED_FILES=("rgb_plot.csv")
+EXPECTED_GREENNESS_VALUES=((2.3 0 20.55 1.6 52.72 -50.42 22.85 10.66 1.01 0.0 0.33))
+
+# What folder are we looking in for outputs
+if [[ ! "${1}" == "" ]]; then
+    TARGET_FOLDER="${1}"
+else
+    TARGET_FOLDER="./outputs"
+fi
+
+# What our target file to read is
+if [[ ! "${2}" == "" ]]; then
+    CHECK_FILE="${2}"
+else
+    CHECK_FILE="rgb_plot.csv"
+fi
+EXPECTED_FILES+=("${CHECK_FILE}")
+
+# Check if expected files are found
+for i in $(seq 0 $(( ${#EXPECTED_FILES[@]} - 1 )))
+do
+    if [[ ! -f "${TARGET_FOLDER}/${EXPECTED_FILES[$i]}" ]]; then
+        echo "Expected file ${EXPECTED_FILES[$i]} is missing"
+        exit 10
+    fi
+done
+
+# Check the results of the canopy cover calculation
+RESULT_VALUES=(`gawk '
+BEGIN {
+    FPAT = "([^,]+)|(\"[^\"]+\")"
+}
+{
+    if ($1 != "germplasmName") { # Skipping the header line
+        printf("(%s %s %s %s %s %s %s %s %s %s %s)\n", $9, $10, $11, $12, $13, $14, $15, $16, $17, $18, $19)
+    }
+}
+END {
+}
+' "${TARGET_FOLDER}/${CHECK_FILE}"`)
+
+echo "Result counts: ${#EXPECTED_GREENNESS_VALUES[@]} vs ${#RESULT_VALUES[@]}"
+if [[ ${#EXPECTED_GREENNESS_VALUES[@]} != ${#RESULT_VALUES[@]} ]]; then
+    echo "Number of results found in file (${#RESULT_VALUES[@]}) doesn't match the expected count (${#EXPECTED_GREENNESS_VALUES[@]})"
+    if [[ ${#RESULT_VALUES[@]} > 0 ]]; then
+        for i in $(seq 0 $(( ${#RESULT_VALUES[@]} - 1 )))
+        do
+            echo "${i}: ${RESULT_VALUES[$i]}"
+        done
+    fi
+    exit 20
+fi
+
+#for i in $(seq 0 $(( ${#EXPECTED_GREENNESS_VALUES[@]} - 1 )))
+#do
+#    # Check that we have the same number of values
+#    CUR_EXPECTED=${EXPECTED_GREENNESS_VALUES[$i]}
+#    CUR_VALUES=${RESULT_VALUES[$i]}
+#    if [[ ${#CUR_EXPECTED[@]} != ${#CUR_VALUES[@]} ]]; then
+#        echo "Row ${i}: Expected ${#CUR_EXPECTED[@]} values and received ${#CUR_VALUES[@]}"
+#        exit 30
+#    fi
+#
+#    # Check each of the values
+#    for j in $(seq 0 $(( ${#CUR_EXPECTED[@]} - 1 )))
+#    do
+#        if [[ "${CUR_EXPECTED[$j]}" == "${CUR_VALUES[$j]}" ]]; then
+#            echo "Values for index ${j} match: '${CUR_EXPECTED[$j]}' '${CUR_VALUES[$j]}'"
+#        else
+#            echo "Result value for index ${j}: '${CUR_VALUES[$j]}' doesn't match expected: '${CUR_EXPECTED[$j]}'"
+#            exit 30
+#        fi
+#    done
+#done

diff --git a/.github/workflows/testing_docker.yaml b/.github/workflows/testing_docker.yaml
index 08cf433..4e20d74 100644
--- a/.github/workflows/testing_docker.yaml
+++ b/.github/workflows/testing_docker.yaml
@@ -56,6 +56,7 @@ jobs:
         run: echo "Outputs folder" && ls -l ./outputs
       - name: Check outputs
         run: |
+          cat "outputs/rgb_plot.csv"
           chmod +x "./.github/workflows/docker_test_check.sh"
           "./.github/workflows/docker_test_check.sh"

From 3f78557d7b19ab620256ea4eaabec308a6452b12 Mon Sep 17 00:00:00 2001
From: Chris Schnaufer
Date: Thu, 3 Sep 2020 15:23:18 -0700
Subject: [PATCH 5/8] Fixing validation script

---
 .github/workflows/docker_test_check.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
 mode change 100644 => 100755 .github/workflows/docker_test_check.sh

diff --git a/.github/workflows/docker_test_check.sh b/.github/workflows/docker_test_check.sh
old mode 100644
new mode 100755
index 7b2a32d..b81c12d
--- a/.github/workflows/docker_test_check.sh
+++ b/.github/workflows/docker_test_check.sh
@@ -4,7 +4,7 @@
 
 # Define expected results
 EXPECTED_FILES=("rgb_plot.csv")
-EXPECTED_GREENNESS_VALUES=((2.3 0 20.55 1.6 52.72 -50.42 22.85 10.66 1.01 0.0 0.33))
+EXPECTED_GREENNESS_VALUES=(2.3 0 20.55 1.6 52.72 -50.42 22.85 10.66 1.01 0.0 0.33)

From cc5ca9eaae7e62d57272c58edbaba3762f0eafd7 Mon Sep 17 00:00:00 2001
From: Chris Schnaufer
Date: Thu, 3 Sep 2020 15:35:33 -0700
Subject: [PATCH 6/8] Removing branch from workflow YAML

---
 .github/workflows/testing_docker.yaml | 1 -
 1 file changed, 1 deletion(-)

diff --git a/.github/workflows/testing_docker.yaml b/.github/workflows/testing_docker.yaml
index 4e20d74..c15cfe8 100644
--- a/.github/workflows/testing_docker.yaml
+++ b/.github/workflows/testing_docker.yaml
@@ -4,7 +4,6 @@ on:
   push:
     branches:
       - master
       - develop
-      - docker_test_workflow
   pull_request:
     branches:
       - master

From ba17ee6720ed4b9296cc877bd85ad32c5f886cf5 Mon Sep 17 00:00:00 2001
From: Chris Schnaufer
Date: Wed, 9 Sep 2020 12:34:06 -0700
Subject: [PATCH 7/8] Adding testing documentation

---
 README.md | 80 +++++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 80 insertions(+)

diff --git a/README.md b/README.md
index 3cf20d3..da96e27 100644
--- a/README.md
+++ b/README.md
@@ -65,3 +65,83 @@ Hunt, E. Raymond, Michel Cavigelli, Craig ST Daughtry, James E. Mcmurtrey, and C
 Hague, T., N. D. Tillett, and H. Wheeler. "Automated crop and weed monitoring in widely spaced cereals." Precision Agriculture 7, no. 1 (2006): 21-32. https://doi.org/10.1007/s11119-005-6787-1
 
 Richardson, Andrew D., Julian P. Jenkins, Bobby H. Braswell, David Y. Hollinger, Scott V. Ollinger, and Marie-Louise Smith. "Use of digital webcam images to track spring green-up in a deciduous broadleaf forest." Oecologia 152, no. 2 (2007): 323-334. https://doi.org/10.1007/s00442-006-0657-z
+
+### Sample Docker Command line
+
+Below is a sample command line that shows how the greenness Docker image could be run.
+An explanation of the command line options used follows.
+Be sure to read up on the [docker run](https://docs.docker.com/engine/reference/run/) command line for more information.
+
+```docker run --rm --mount "src=${PWD}/test_data,target=/mnt,type=bind" agdrone/transformer-greenness:1.0 --working_space "/mnt" --metadata "/mnt/experiment.yaml" "/mnt/rgb_1_2_E.tif" ```
+
+This example command line assumes the source files are located in the `test_data` folder off the current folder.
+The name of the image to run is `agdrone/transformer-greenness:1.0`.
+
+We are using the same folder for the source files and the output files.
+By using multiple `--mount` options, the source and output files can be separated.
+
+**Docker commands** \
+Everything between 'docker' and the name of the image consists of Docker commands.
+
+- `run` indicates we want to run an image
+- `--rm` automatically deletes the image instance after it's run
+- `--mount "src=${PWD}/test_data,target=/mnt,type=bind"` mounts the `${PWD}/test_data` folder to the `/mnt` folder of the running image
+
+We mount the `${PWD}/test_data` folder to the running image to make files available to the software in the image.
+
+**Image's commands** \
+The command line parameters after the image name are passed to the software inside the image.
+Note that the paths provided are relative to the running image (see the --mount option specified above).
+
+- `--working_space "/mnt"` specifies the folder to use as a workspace
+- `--metadata "/mnt/experiment.yaml"` is the name of the source metadata file
+- `"/mnt/rgb_1_2_E.tif"` is the name of the image to calculate greenness on
+
+## Acceptance Testing
+
+There are automated test suites that are run via [GitHub Actions](https://docs.github.com/en/actions).
+In this section we provide details on these tests so that they can be run locally as well.
+
+These tests are run when a [Pull Request](https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests) or [push](https://docs.github.com/en/github/using-git/pushing-commits-to-a-remote-repository) occurs on the `develop` or `master` branches.
+There may be other instances when these tests are automatically run, but these are considered the mandatory events and branches.
+
+### PyLint and PyTest
+
+These tests are run against any Python scripts that are in the repository.
+
+[PyLint](https://www.pylint.org/) is used both to check that Python code conforms to the recommended coding style and to check for syntax errors.
+The default behavior of PyLint is modified by the `pylint.rc` file in the [Organization-info](https://github.com/AgPipeline/Organization-info) repository.
+Please also refer to our [Coding Standards](https://github.com/AgPipeline/Organization-info#python) for information on how we use [pylint](https://www.pylint.org/).
+
+The following command can be used to fetch the `pylint.rc` file:
+```bash
+wget https://raw.githubusercontent.com/AgPipeline/Organization-info/master/pylint.rc
+```
+
+Assuming the `pylint.rc` file is in the current folder, the following command can be used against the `algorithm_rgb.py` file:
+```bash
+# Assumes Python3.7+ is the default Python version
+python -m pylint --rcfile ./pylint.rc algorithm_rgb.py
+```
+
+In the `tests` folder there are testing scripts; their supporting files are in the `test_data` folder.
+The tests are designed to be run with [Pytest](https://docs.pytest.org/en/stable/).
+When running the tests, the root of the repository is expected to be the starting directory.
+
+The command line for running the tests is as follows:
+```bash
+# Assumes Python3.7+ is the default Python version
+python -m pytest -rpP
+```
+
+If [pytest-cov](https://pytest-cov.readthedocs.io/en/latest/) is installed, it can be used to generate a code coverage report as part of running PyTest.
+The code coverage report shows how much of the code has been tested; it doesn't indicate **how well** that code has been tested.
+The modified PyTest command line including coverage is:
+```bash
+# Assumes Python3.7+ is the default Python version
+python -m pytest --cov=. -rpP
+```
+
+### Docker Testing
+
+The Docker testing workflow replicates the examples in this document to ensure they continue to work.

From a9cbf5255617d663c4b20f54899dd74f654ae72d Mon Sep 17 00:00:00 2001
From: Chris Schnaufer
Date: Wed, 9 Sep 2020 13:02:43 -0700
Subject: [PATCH 8/8] Added building docker image

---
 README.md | 13 ++++++++++++-
 1 file changed, 12 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index da96e27..7289af2 100644
--- a/README.md
+++ b/README.md
@@ -66,13 +66,24 @@ Hague, T., N. D. Tillett, and H. Wheeler. "Automated crop and weed monitoring in
 
 Richardson, Andrew D., Julian P. Jenkins, Bobby H. Braswell, David Y. Hollinger, Scott V. Ollinger, and Marie-Louise Smith. "Use of digital webcam images to track spring green-up in a deciduous broadleaf forest." Oecologia 152, no. 2 (2007): 323-334. https://doi.org/10.1007/s00442-006-0657-z
 
+## Use
+
 ### Sample Docker Command line
 
+First build the Docker image, using the Dockerfile, and tag it `agdrone/transformer-greenness:1.0`.
+Read about the [docker build](https://docs.docker.com/engine/reference/commandline/build/) command if needed.
+
+```bash
+docker build -t agdrone/transformer-greenness:1.0 ./
+```
+
 Below is a sample command line that shows how the greenness Docker image could be run.
 An explanation of the command line options used follows.
 Be sure to read up on the [docker run](https://docs.docker.com/engine/reference/run/) command line for more information.
 
-```docker run --rm --mount "src=${PWD}/test_data,target=/mnt,type=bind" agdrone/transformer-greenness:1.0 --working_space "/mnt" --metadata "/mnt/experiment.yaml" "/mnt/rgb_1_2_E.tif" ```
+```bash
+docker run --rm --mount "src=${PWD}/test_data,target=/mnt,type=bind" agdrone/transformer-greenness:1.0 --working_space "/mnt" --metadata "/mnt/experiment.yaml" "/mnt/rgb_1_2_E.tif"
+```
 
 This example command line assumes the source files are located in the `test_data` folder off the current folder.
 The name of the image to run is `agdrone/transformer-greenness:1.0`.