This repository contains the code to run experiments with a deep neural network that uses an ordinal structure in the output layer and a cross-entropy loss regularised with the beta distribution. The following components are implemented (a minimal sketch of the main ideas is given after this list):
- Stick-breaking output structure.
- Poisson regularisation for the cross-entropy loss function.
- Binomial regularisation for the cross-entropy loss function.
- Exponential regularisation for the cross-entropy loss function.
- Beta regularisation for the cross-entropy loss function.
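The sketch below illustrates, in plain NumPy, the two ideas behind these components: a stick-breaking transformation of the network outputs into ordered class probabilities, and unimodal soft targets built from a beta distribution that replace the hard one-hot labels in the cross-entropy loss. It is only an illustrative sketch, not the repository's implementation: the function names, the use of scipy for the beta CDF, and the values of `a`, `b` and `eta` are assumptions.

```python
import numpy as np
from scipy.stats import beta  # scipy assumed here for the beta CDF (not in the requirements list)


def stick_breaking_probs(f):
    """Map K-1 sigmoid outputs f in (0, 1) to K class probabilities:
    p_1 = f_1, p_k = f_k * prod_{j<k} (1 - f_j), p_K = prod_j (1 - f_j)."""
    f = np.asarray(f, dtype=float)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - f)))  # stick left before each break
    return np.append(f, 1.0) * remaining


def beta_soft_labels(num_classes, a, b):
    """Discretise a Beta(a, b) density into num_classes ordered bins on [0, 1],
    giving a unimodal distribution over the ordinal classes."""
    edges = np.linspace(0.0, 1.0, num_classes + 1)
    mass = beta.cdf(edges[1:], a, b) - beta.cdf(edges[:-1], a, b)
    return mass / mass.sum()


def regularised_targets(y_onehot, unimodal, eta=0.15):
    """Convex combination of the one-hot target and a unimodal distribution;
    the usual categorical cross-entropy is then computed against these soft
    targets instead of the hard labels."""
    return (1.0 - eta) * np.asarray(y_onehot, dtype=float) + eta * np.asarray(unimodal, dtype=float)


# Toy usage with 5 ordinal classes: the beta parameters would normally be
# chosen per class so that the soft distribution peaks at the true class.
probs = stick_breaking_probs([0.7, 0.6, 0.5, 0.4])           # 5 class probabilities
soft = beta_soft_labels(5, a=2.0, b=5.0)                     # unimodal, peaked at the low classes
targets = regularised_targets(np.eye(5)[1], soft, eta=0.15)  # soft target for the second class
```

The Poisson, binomial and exponential variants listed above follow the same pattern, with the beta distribution replaced by the corresponding (discrete or discretised) unimodal distribution.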
This repository requires the following (a matching requirements.txt sketch is shown after the list):
- Python (>= 3.6.8)
- Keras (==2.2.4)
- numpy (>=1.17.2)
- opencv-python (>=4.1.2)
- pandas (>=0.23.4)
- scikit-image (>=0.15.0)
- scikit-learn (>=0.21.3)
- tensorflow (==1.13.1)
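These dependencies correspond to a requirements.txt along the following lines (a sketch that simply mirrors the versions listed above; the actual file in the repository may differ):

```
Keras==2.2.4
numpy>=1.17.2
opencv-python>=4.1.2
pandas>=0.23.4
scikit-image>=0.15.0
scikit-learn>=0.21.3
tensorflow==1.13.1
```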
To install the requirements, use:

```
pip install -r requirements.txt
```
Contributions are welcome. Code in pull requests should follow PEP 8, for example by formatting it with yapf.
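A typical yapf invocation to format the whole repository in place would be (one possible choice of flags, not a project requirement):

```
yapf --in-place --recursive .
```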
You can run all the experiments using the file run.sh.
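For example, from the repository root (assuming a Unix-like shell):

```
bash run.sh
```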
The paper titled "Unimodal regularisation based on beta distribution for ordinal regression with Deep Learning" has been submitted to Pattern Recognition.
- Víctor Manuel Vargas (@victormvy)
- Pedro Antonio Gutiérrez (@pagutierrez)
- César Hervás-Martínez ([email protected])