Merge pull request #2 from charlesll/v2.1.0
V2.1.0
charlesll authored Jun 17, 2024
2 parents 3267056 + 12dd93d commit 3d9f8fd
Showing 110 changed files with 6,260 additions and 30,055 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -13,3 +13,5 @@ figures/dev/
dev_*
compressed_files.tar.gz
src/bkg
build
imelt.egg-info
445 changes: 0 additions & 445 deletions Example_Prediction_OneComposition.ipynb

This file was deleted.

319 changes: 0 additions & 319 deletions Example_entropy_calculations.ipynb

This file was deleted.

446 changes: 0 additions & 446 deletions Example_prediction_SelectedCompositions.ipynb

This file was deleted.

282 changes: 0 additions & 282 deletions Example_prediction_ternary.ipynb

This file was deleted.

2 changes: 1 addition & 1 deletion LICENSE
@@ -1,6 +1,6 @@
MIT License

Copyright (c) 2021-2023 Charles Le Losq, Barbara Baldoni, Andrew Valentine
Copyright (c) 2021-2024 Charles Le Losq, Barbara Baldoni, Andrew Valentine

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
9 changes: 8 additions & 1 deletion NEWS.md
@@ -1,9 +1,16 @@
# i-Melt

(c) 2021-2023 Charles Le Losq and co., [email protected]
(c) 2021-2024 Charles Le Losq and co., [email protected]

## NEWS

### V2.1.0

- i-Melt is now a Python package: install it using `pip install imelt`!
- new examples: see the new examples folder in the repository
- the database has been moved to the folder ./src/imelt/data
- functions to simplify queries are available: generate_query_single and generate_query_range

### V2.0.1

- update of the Results_paper.ipynb notebook following the reviews of the manuscript.
Binary file added dist/imelt-2.1.0-py3-none-any.whl
Binary file added dist/imelt-2.1.0.tar.gz
Binary file modified docs/build/doctrees/data.doctree
Binary file modified docs/build/doctrees/environment.pickle
Binary file modified docs/build/doctrees/index.doctree
Binary file modified docs/build/doctrees/installation.doctree
Binary file modified docs/build/doctrees/predictions.doctree
Binary file modified docs/build/doctrees/references.doctree
Binary file removed docs/build/doctrees/result_analysis.doctree
Binary file modified docs/build/doctrees/training.doctree
Binary file removed docs/build/html/.doctrees/data.doctree
Binary file removed docs/build/html/.doctrees/environment.pickle
Binary file removed docs/build/html/.doctrees/index.doctree
Binary file removed docs/build/html/.doctrees/installation.doctree
Binary file removed docs/build/html/.doctrees/predictions.doctree
Binary file removed docs/build/html/.doctrees/references.doctree
Binary file removed docs/build/html/.doctrees/result_analysis.doctree
Binary file removed docs/build/html/.doctrees/training.doctree
6 changes: 3 additions & 3 deletions docs/build/html/_sources/data.rst.txt
@@ -4,14 +4,14 @@ Database
Database localisation
---------------------

All data are given in a `Database.xlsx <https://github.com/charlesll/i-melt/blob/master/data/Database.xlsx>`_ file in the `data folder <https://github.com/charlesll/i-melt/tree/master/data>`_.
All data are given in a `Database.xlsx <https://github.com/charlesll/i-melt/blob/master/data/Database.xlsx>`_ file in the `/src/imelt/data folder <https://github.com/charlesll/i-melt/tree/master/data>`_.

The data used for training the currently provided mdels are in HDF5 format in the data folder.
The data used for training the currently provided models are in HDF5 format in the data folder. They are also provided with the library in /src/imelt/data/.

Data preparation
----------------

Run the script `Dataset_preparation.py <https://github.com/charlesll/i-melt/blob/master/src/Dataset_preparation.py>`_ to prepare the datasets, which are subsequently saved in HDF5 format in the data folder.
The script `Dataset_preparation.py <https://github.com/charlesll/i-melt/blob/master/src/Dataset_preparation.py>`_ allows preparing the datasets, which are subsequently saved in HDF5 format in the data folder.

The `Dataset_visualization.py <https://github.com/charlesll/i-melt/blob/master/src/Dataset_visualization.py>`_ script allows generating several figures, saved in /figures/datasets/

3 changes: 2 additions & 1 deletion docs/build/html/_sources/index.rst.txt
@@ -30,8 +30,9 @@ The project is hosted on `Github <https://github.com/charlesll/i-melt>`_, a `Str
installation
data
training
result_analysis
predictions
tutorials
bugs
references

Indices and tables
18 changes: 12 additions & 6 deletions docs/build/html/_sources/installation.rst.txt
@@ -4,20 +4,26 @@ Installation
General preparation
-------------------


i-Melt runs with a traditional Python stack.

If you are not familiar with Python, you can first have a look at the `scipy lecture notes <https://scipy-lectures.org/>`_,
a set of tutorials for the beginner.

You can install `Anaconda Python <https://www.anaconda.com/products/individual>`_ to get a running Python distribution. See the documentation of Anaconda for those steps.

Installation of libraries for i-Melt
------------------------------------
Installation of i-Melt
----------------------

i-Melt is now a Python package. It supports Python 3.8 or higher. Install it using pip:

`$ pip install imelt`

The necessary libraries are listed in the requirements.txt file.
There may be a problem with the installation of aws-fortuna.

Scripts for building, training models and providing useful functions are contained in the `src <https://github.com/charlesll/i-melt/blob/master/src/>`_ folder.
Apparently there are problems with the versions of jax and flax required by this package. I reported the issue, but it is still ongoing.
For now, an easy fix is to install aws-fortuna, and then to upgrade jax and flax to the latest versions:

Starting from a barebone Python 3 environment with pip installed, simply open a terminal pointing to the working folder and type::
`$ pip install --upgrade jax flax`

$ pip install --user -r requirements.txt
A version error message will appear, but you can disregard it: aws-fortuna works and there is no further problem.
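
Once installed, a quick sanity check is to load the pre-trained models from Python (a minimal sketch; it only uses the `load_pretrained_bagged` function described in the prediction documentation):

.. code-block:: python
import imelt  # the package installed with pip
imelt_model = imelt.load_pretrained_bagged()  # load the bagged pre-trained models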
78 changes: 57 additions & 21 deletions docs/build/html/_sources/predictions.rst.txt
@@ -9,12 +9,8 @@ The easiest way to try i-Melt is to use the `web calculator <https://i-melt.stre
Jupyter notebooks
-----------------

More control can be achieved using directly the i-melt library.
More control can be achieved by using the i-melt Python library directly. Have a look at the :doc:`tutorials` page.

Several example notebooks are provided in the main repository. We invite you to have a look at them directly.

If you want to have an example of use for making predictions for a given composition, have a
look at the `Example_Prediction_OneComposition.ipynb <https://share.streamlit.io/charlesll/i-melt/Example_Prediction_OneComposition.ipynb>`_ notebook.

The steps are simple. First, import the necessary libraries and imelt:

@@ -26,32 +22,72 @@ The steps are simple. First, import the necessary libraries and imelt:
import numpy as np # for arrays
import pandas as pd # for dataframes
import matplotlib.pyplot as plt # for plotting
import src.imelt as imelt # imelt core functions
import src.utils as utils # utility functions
import imelt
Then, we can load the pre-trained i-melt models as one Python object:

.. code-block:: python
imelt_model = imelt.load_pretrained_bagged()
Now we can define a composition of interest in a Panda dataframe.
We also automatically add descriptors to the composition.
Now we can define a composition of interest using the `generate_query_single` function.
It does everything automatically for us, including the addition of descriptors.

.. code-block:: python
composition = imelt.generate_query_single(sio2 = 75.0,
al2o3 = 12.5,
na2o = 12.5,
k2o = 0.0,
mgo = 0.0,
cao = 0.0,
composition_mole=True)
To get predictions from the model `imelt_model`, we use its `predict` method:

.. code-block:: python
prediction = imelt_model.predict("property", composition)
where "property" is a string indicating the property you want, composition is a compositional array that has been put in good shape (simply use the `generate_query_single` or `generate_query_range` to do so). Optional arguments are the temperature T and lbd, the optical wavelength at which you want the optical refractive index if that is what you are after.

Here is a list of the "property" available:

- melt viscosity (log10 Pa s) using Adam-Gibbs: enter "ag"
- melt viscosity (log10 Pa s) using Vogel-Tammann-Fulcher: enter "tvf"
- melt viscosity (log10 Pa s) using Free Volume: enter "cg"
- melt viscosity (log10 Pa s) using Avramov-Milchev: enter "am"
- melt viscosity (log10 Pa s) using MYEGA: enter "myega"
- melt fragility: enter "fragility"
- melt liquidus (K): enter "liquidus"
- glass transition temperature (K): enter "tg"
- glass configurational entropy (J/mol/K): enter "sctg"
- glass density (g/cm3): enter "density_glass"
- glass elastic modulus (GPa): enter "elastic_modulus"
- glass coefficient of thermal expansion: enter "cte"
- glass Abbe number: enter "abbe"
- glass optical refractive index: enter "sellmeier"
- glass Raman spectrum: enter "raman_pred"

Note that for the melt viscosity you must provide a vector of temperatures, and for the glass optical refractive index you must provide a vector of wavelengths.

For instance, if you want predictions for melt viscosity between 1000 and 3000 K with a step of 1 K, you will do

.. code-block:: python
composition = [0.75, # SiO2
0.125, # Al2O3
0.125, # Na2O
0.0, # K2O
0.0, # MgO
0.0] # CaO
T_range = np.arange(1000.0, 3000.0, 1.0)
viscosity = imelt_model.predict("vft", composition, T_range)
# we transform composition in a dataframe and add descriptors
composition = pd.DataFrame(np.array(composition).reshape(1,6), columns=["sio2","al2o3","na2o","k2o","mgo","cao"])
composition = utils.descriptors(composition.loc[:,["sio2","al2o3","na2o","k2o","mgo","cao"]]).values
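To visualize the result, a small matplotlib sketch could look like this (assuming the returned `viscosity` array can be flattened to match `T_range`):

.. code-block:: python
plt.plot(T_range, np.asarray(viscosity).ravel())  # flatten in case the output is 2D
plt.xlabel("Temperature (K)")
plt.ylabel("log10 viscosity (Pa s)")
plt.show()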
To get the glass optical refractive index at 589 nm, you will do:

WARNING: lambda is provided in microns!

.. code-block:: python
lbd = np.array([589.0*1e-3]) # warning: enter the wavelength in microns
ri = imelt_model.predict("sellmeier", composition, lbd=lbd)
Predictions can be obtained for Tg with:
And for a property such as Tg, you can do:

.. code-block:: python
@@ -82,8 +118,8 @@ We can predict the viscosity with the Vogel-Tammann-Fulcher equation. First, we
.. code-block:: python
T_range = np.arange(600, 1500, 1.0) # from 600 to 1500 K with 1 K steps
viscosity = imelt_model.predict("tvf",composition*np.ones((len(T_range),39)),T_range.reshape(-1,1))
viscosity = imelt_model.predict("tvf", composition, T_range)
In the above code note that the composition array has to be modified so that you have as many lines as you have temperatures to predict.

Many other predictions are possible, look at the Jupyter notebooks for more details and examples.
Many other predictions are possible; look at the :doc:`tutorials` for more details and examples.
4 changes: 3 additions & 1 deletion docs/build/html/_sources/references.rst.txt
@@ -10,4 +10,6 @@ https://medium.com/pytorch/from-windows-to-volcanoes-how-pytorch-is-helping-us-u

`Le Losq C., Baldoni B., Valentine A. P., 2023. charlesll/i-melt: i-Melt v2.0.0 (v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.7858297 <https://doi.org/10.5281/zenodo.7858297>`_

`Le Losq C., Badoni B., 2023. Machine learning modeling of the atomic structure and physical properties of alkali and alkaline-earth aluminosilicate glasses and melts. arXiv:2304.12123 https://doi.org/10.48550/arXiv.2304.12123 <https://doi.org/10.48550/arXiv.2304.12123>`_
`Le Losq C., Baldoni B., 2023. Machine learning modeling of the atomic structure and physical properties of alkali and alkaline-earth aluminosilicate glasses and melts. arXiv:2304.12123 https://doi.org/10.48550/arXiv.2304.12123 <https://doi.org/10.48550/arXiv.2304.12123>`_

`Le Losq, C., Baldoni, B., 2023. Machine learning modeling of the atomic structure and physical properties of alkali and alkaline-earth aluminosilicate glasses and melts. Journal of Non-Crystalline Solids 617, 122481. https://doi.org/10.1016/j.jnoncrysol.2023.122481 <https://doi.org/10.1016/j.jnoncrysol.2023.122481>`_
6 changes: 0 additions & 6 deletions docs/build/html/_sources/result_analysis.rst.txt

This file was deleted.

26 changes: 14 additions & 12 deletions docs/build/html/_sources/training.rst.txt
@@ -1,7 +1,16 @@
Training the networks
Training
=====================

The easiest way of training one or multiple neural networks is to use the scripts that are provided in /src.
The i-Melt 2.1 library is meant to provide trained models and use them for predictions.

However, if you want to play with the code and train new models, you can do so by following the instructions listed below. Note that paths will probably have to be slightly modified, as the current library is intended to be used for predictions and the code for training was written prior to the latest "production release".

Getting the scripts
-------------------

Scripts for building, training models and providing useful functions are provided `here <https://github.com/charlesll/i-melt/blob/master/src/>`_.

The easiest way of training one or multiple neural networks is to use those scripts. I suggest getting a copy of the Github repository and working in it directly, as this will simplify things.

Training one network
--------------------
@@ -29,8 +38,8 @@ If we want to save the model and figures in the directories `./model/candidates/

.. code-block:: python
utils.create_dir('./model/candidates/')
utils.create_dir('./figures/single/')
imelt.create_dir('./model/candidates/')
imelt.create_dir('./figures/single/')
Now we need a name for our model. We can generate it from the hyperparameters, which will help us get automatic names in case we try different architectures:

@@ -92,20 +101,13 @@ Hyperparameter tuning
RAY TUNE + OPTUNA
^^^^^^^^^^^^^^^^^

In the version 2.0, we rely on `Ray Tune <https://docs.ray.io/en/latest/tune/index.html>`_ and `Optuna <https://optuna.org/>`_ to search for the best models.
In version 2.0 and above, we rely on `Ray Tune <https://docs.ray.io/en/latest/tune/index.html>`_ and `Optuna <https://optuna.org/>`_ to search for the best models.

The script `ray_opt.py <https://github.com/charlesll/i-melt/blob/master/src/ray_opt.py>`_ allows running a Ray Tune experiment.

The script `ray_select.py <https://github.com/charlesll/i-melt/blob/master/src/ray_select.py>`_ allows selecting the best models
based on posterior analysis of the Ray Tune experiment (all metrics recorded in an Excel spreadsheet that must be provided for model selection).

Bayesian optimization
^^^^^^^^^^^^^^^^^^^^^

CURRENTLY NOT WORKING

The `bayesian_optim.py <https://github.com/charlesll/i-melt/blob/master/src/bayesian_optim.py>`_ script allows performing Bayesian Optimization for hyperparameter selection using AX plateform.

Training candidates
-------------------

