Merge pull request #206 from CosmiQ/dev
Version bump to 0.1.2
Showing 151 changed files with 3,237 additions and 354 deletions.
@@ -120,3 +120,5 @@ dmypy.json

# Project-specific
sandbox.ipynb
solaris/nets/weights
model_weights
@@ -0,0 +1 @@
<ul class="globaltoc">{{ toctree(collapse=False,includehidden=theme_globaltoc_includehidden|tobool) }}</ul>
@@ -1,3 +1,5 @@
.. _api_index:

.. title:: API reference contents

###################
@@ -1,3 +1,5 @@
.. _installation:

######################
Installing ``solaris``
######################
@@ -1,3 +1,5 @@
.. _intro:


##############################
An introduction to ``solaris``
@@ -0,0 +1,62 @@
.. _pretrained_models:

##########################################
Pretrained models available in ``solaris``
##########################################

``solaris`` provides access to a number of pre-trained models from
`the SpaceNet challenges <https://spacenet.ai>`_. See the table below for a
summary. The model name in the first column is the value to use for the
``"model_name"`` argument in
`the config file <tutorials/notebooks/creating_the_yaml_config_file.html>`_
if you wish to use that model with ``solaris``. Note that we re-trained the
competitors' models for compatibility with ``solaris``, so training parameters,
inputs, and performance may vary slightly from the original models.

Model details
=============

+------------------------------------+---------------------+-----------------------+----------------+-------------+--------------+---------------------------------+---------------------------------------+
| Model name | Model type | Model details | # Parameters | Input shape | Output shape | Config file | Weights file |
+====================================+=====================+=======================+================+=============+==============+=================================+=======================================+
| xdxd_spacenet4 | Segmentation UNet | Encoder: VGG16 | 29.3M | 3x512x512 | 1x512x512 | `link <XDXDconfig_>`_ | `link <XDXDweights_>`_ (117 MB) |
+------------------------------------+---------------------+-----------------------+----------------+-------------+--------------+---------------------------------+---------------------------------------+
| selimsef_spacenet4_resnet34unet | Segmentation UNet | Encoder: ResNet-34 | 30.0M | 4x416x416 | 3x416x416 | `link <ssresnet34config_>`_ | `link <ssresnet34weights_>`_ (120 MB) |
+------------------------------------+---------------------+-----------------------+----------------+-------------+--------------+---------------------------------+---------------------------------------+
| selimsef_spacenet4_densenet121unet | Segmentation UNet | Encoder: DenseNet-121 | 15.6M | 3x384x384 | 3x384x384 | `link <ssdense121config_>`_ | `link <ssdense121weights_>`_ (63 MB) |
+------------------------------------+---------------------+-----------------------+----------------+-------------+--------------+---------------------------------+---------------------------------------+
| selimsef_spacenet4_densenet161unet | Segmentation UNet | Encoder: DenseNet-161 | 41.1M | 3x384x384 | 3x384x384 | `link <ssdense161config_>`_ | `link <ssdense161weights_>`_ (158 MB) |
+------------------------------------+---------------------+-----------------------+----------------+-------------+--------------+---------------------------------+---------------------------------------+

Training details
================

Below is a summary of the training hyperparameters for each model. For image
pre-processing and augmentation pipelines see the config files linked above.
*Note that our hyperparameters may differ from the competitors' original values.*
See `their solution descriptions <https://github.com/spacenetchallenge>`_ for
more on their implementations.

+------------------------------------+-------------------------+-------------------+---------------+------------------------+-----------------+------------+-----------------+---------------------+
| Model name | Loss function | Optimizer | Learning Rate | Training input | Training mask | Batch size | Training Epochs | Pre-trained weights |
+====================================+=========================+===================+===============+========================+=================+============+=================+=====================+
| xdxd_spacenet4 | BCE + | Adam | 1e-4 | SpaceNet 4 | Footprints only | 12 | 60 | None |
| | Jaccard (4:1) | default params | with decay | Pan-sharpened RGB | | | | |
+------------------------------------+-------------------------+-------------------+---------------+------------------------+-----------------+------------+-----------------+---------------------+
| selimsef_spacenet4_resnet34unet | Focal + Dice | AdamW | 2e-4 | SpaceNet 4 | 3-channel (FP, | 42 | 70 | ImageNet (encoder |
| | (1:1) | 1e-3 weight decay | with decay | Pan-sharpened RGB+NIR | edge, contact) | | | only) |
+------------------------------------+-------------------------+-------------------+---------------+------------------------+-----------------+------------+-----------------+---------------------+
| selimsef_spacenet4_densenet121unet | Focal + Dice | AdamW | 2e-4 | SpaceNet 4 | 3-channel (FP, | 32 | 70 | ImageNet (encoder |
| | (1:1) | 1e-3 weight decay | with decay | Pan-sharpened RGB | edge, contact) | | | only) |
+------------------------------------+-------------------------+-------------------+---------------+------------------------+-----------------+------------+-----------------+---------------------+
| selimsef_spacenet4_densenet161unet | Focal + Dice | AdamW | 2e-4 | SpaceNet 4 | 3-channel (FP, | 20 | 60 | ImageNet (encoder |
| | (1:1) | 1e-3 weight decay | with decay | Pan-sharpened RGB | edge, contact) | | | only) |
+------------------------------------+-------------------------+-------------------+---------------+------------------------+-----------------+------------+-----------------+---------------------+

.. _XDXDconfig: https://github.com/CosmiQ/solaris/blob/master/solaris/nets/configs/xdxd_spacenet4.yml
.. _ssresnet34config: https://github.com/CosmiQ/solaris/blob/master/solaris/nets/configs/selimsef_resnet34unet_spacenet4.yml
.. _ssdense121config: https://github.com/CosmiQ/solaris/blob/master/solaris/nets/configs/selimsef_densenet121unet_spacenet4.yml
.. _ssdense161config: https://github.com/CosmiQ/solaris/blob/master/solaris/nets/configs/selimsef_densenet161unet_spacenet4.yml
.. _XDXDweights: https://s3.amazonaws.com/spacenet-dataset/spacenet-model-weights/spacenet-4/xdxd_spacenet4_solaris_weights.pth
.. _ssresnet34weights: https://s3.amazonaws.com/spacenet-dataset/spacenet-model-weights/spacenet-4/selimsef_spacenet4_resnet34unet_solaris_weights.pth
.. _ssdense121weights: https://s3.amazonaws.com/spacenet-dataset/spacenet-model-weights/spacenet-4/selimsef_spacenet4_densenet121unet_solaris_weights.pth
.. _ssdense161weights: https://s3.amazonaws.com/spacenet-dataset/spacenet-model-weights/spacenet-4/selimsef_spacenet4_densenet161unet_solaris_weights.pth
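
For context on how the ``"model_name"`` values above get used, the tutorial notebook added later in this commit parses one of the bundled YAML configs and hands it to the inference wrapper. A minimal sketch of that flow (the config path and inference CSV path are placeholders; the calls mirror the notebook below):

    import solaris as sol

    # Parse the bundled YAML config for the chosen pretrained model; the
    # "model_name" key inside it matches the first column of the table above.
    config = sol.utils.config.parse('solaris/nets/configs/xdxd_spacenet4.yml')

    # Point the config at a CSV listing the imagery to run inference on
    # (placeholder path; see the image-reference-CSV tutorial).
    config['inference_data_csv'] = '/path/to/test_df.csv'

    # Run inference; predicted masks are written to the directory given by
    # config['inference']['output_dir'].
    inferer = sol.nets.infer.Inferer(config)
    inference_df = sol.nets.infer.get_infer_df(config)
    inferer(inference_df)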
@@ -0,0 +1,161 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Inferencing with included SpaceNet models and the `solaris` Python API\n",
"\n",
"We've included a number of SpaceNet models with `solaris`, including pre-trained model weights. You can find more information about your model choices [here](../pretrained_models.html) and the original competitors' code for the models [here](https://github.com/spacenetchallenge/spacenet_off_nadir_solutions).\n",
"\n",
"For this tutorial we'll walk through running inference with XD_XD's SpaceNet 4 model. We'll use the config file for that model, which you can find [here](https://github.com/CosmiQ/solaris/blob/master/solaris/nets/configs/xdxd_spacenet4.yml).\n",
"\n",
"You'll also need to [create the image reference files](creating_im_reference_csvs.ipynb) before you start.\n",
"\n",
"First, we'll load the configuration:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'model_name': 'xdxd_spacenet4',\n",
" 'model_path': None,\n",
" 'train': False,\n",
" 'infer': True,\n",
" 'pretrained': True,\n",
" 'nn_framework': 'torch',\n",
" 'batch_size': 12,\n",
" 'data_specs': {'width': 512,\n",
" 'height': 512,\n",
" 'image_type': 'zscore',\n",
" 'rescale': False,\n",
" 'rescale_minima': 'auto',\n",
" 'rescale_maxima': 'auto',\n",
" 'channels': 4,\n",
" 'label_type': 'mask',\n",
" 'is_categorical': False,\n",
" 'mask_channels': 1,\n",
" 'val_holdout_frac': 0.2,\n",
" 'data_workers': None},\n",
" 'training_data_csv': '/path/to/training_df.csv',\n",
" 'validation_data_csv': None,\n",
" 'inference_data_csv': '/path/to/test_df.csv',\n",
" 'training_augmentation': {'augmentations': {'DropChannel': {'idx': 3,\n",
" 'axis': 2},\n",
" 'HorizontalFlip': {'p': 0.5},\n",
" 'RandomRotate90': {'p': 0.5},\n",
" 'RandomCrop': {'height': 512, 'width': 512, 'p': 1.0},\n",
" 'Normalize': {'mean': [0.006479, 0.009328, 0.01123],\n",
" 'std': [0.004986, 0.004964, 0.00495],\n",
" 'max_pixel_value': 65535.0,\n",
" 'p': 1.0}},\n",
" 'p': 1.0,\n",
" 'shuffle': True},\n",
" 'validation_augmentation': {'augmentations': {'DropChannel': {'idx': 3,\n",
" 'axis': 2},\n",
" 'CenterCrop': {'height': 512, 'width': 512, 'p': 1.0},\n",
" 'Normalize': {'mean': [0.006479, 0.009328, 0.01123],\n",
" 'std': [0.004986, 0.004964, 0.00495],\n",
" 'max_pixel_value': 65535.0,\n",
" 'p': 1.0}},\n",
" 'p': 1.0},\n",
" 'inference_augmentation': {'augmentations': {'DropChannel': {'idx': 3,\n",
" 'axis': 2,\n",
" 'p': 1.0},\n",
" 'Normalize': {'mean': [0.006479, 0.009328, 0.01123],\n",
" 'std': [0.004986, 0.004964, 0.00495],\n",
" 'max_pixel_value': 65535.0,\n",
" 'p': 1.0}},\n",
" 'p': 1.0},\n",
" 'training': {'epochs': 60,\n",
" 'steps_per_epoch': None,\n",
" 'optimizer': 'Adam',\n",
" 'lr': 0.0001,\n",
" 'opt_args': None,\n",
" 'loss': {'bcewithlogits': None, 'jaccard': None},\n",
" 'loss_weights': {'bcewithlogits': 10, 'jaccard': 2.5},\n",
" 'metrics': {'training': None, 'validation': None},\n",
" 'checkpoint_frequency': 10,\n",
" 'callbacks': {'model_checkpoint': {'filepath': 'xdxd_best.pth',\n",
" 'monitor': 'val_loss'}},\n",
" 'model_dest_path': 'xdxd.pth',\n",
" 'verbose': True},\n",
" 'inference': {'window_step_size_x': None,\n",
" 'window_step_size_y': None,\n",
" 'output_dir': 'inference_out/'}}"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import solaris as sol\n",
"\n",
"config = sol.utils.config.parse('/Users/nweir/code/cosmiq_repos/solaris/solaris/nets/configs/xdxd_spacenet4.yml')\n",
"config"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As you can see, the YAML gets parsed into a set of nested dictionaries by `solaris`. Relevant pieces of that config then get read during training.\n",
"\n",
"Inferencing is _very_ similar to [training](api_training_spacenet.ipynb), with one major difference: you'll load in and pass the reference CSV for your inference dataset as an argument to the `Inferer` object. `solaris` is set up this way so that you can quickly and easily iterate through inference on a number of inputs without having to re-instantiate your inferencer each time."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"inferer = sol.nets.infer.Inferer(config)\n",
"inference_df = sol.nets.infer.get_infer_df(config)\n",
"inferer(inference_df)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The above commands will create prediction segmentation masks for each input image in the `output_dir` specified in your `config`. You can then [use sol.vector.mask.mask_to_poly_geojson to convert these predicted masks to vector-formatted geometries](api_mask_to_vector.ipynb)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "solaris",
"language": "python",
"name": "solaris"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
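
The notebook's last markdown cell points to ``sol.vector.mask.mask_to_poly_geojson`` for turning the predicted masks into vector geometries. A minimal sketch of that step follows; the file names are placeholders, and the keyword arguments (``reference_im``, ``bg_threshold``, ``min_area``, ``output_path``) reflect the documented ``solaris`` API, so verify them against the installed version:

    import skimage.io
    import solaris as sol

    # Placeholder paths: one mask written by the inferer above, plus the
    # georeferenced source tile it was predicted from.
    pred_mask = skimage.io.imread('inference_out/example_tile.tif')

    # Convert the predicted mask into footprint polygons. Keyword names below
    # are assumptions based on the documented solaris API.
    footprints = sol.vector.mask.mask_to_poly_geojson(
        pred_mask,
        reference_im='/path/to/example_tile_source.tif',  # supplies geotransform/CRS
        bg_threshold=0,   # pixels above this value are treated as building
        min_area=40,      # discard very small polygons
        output_path='inference_out/example_tile_footprints.geojson'
    )
    footprints.head()  # a GeoDataFrame of predicted footprint polygons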