From 19e4e7f9ff77c8ed3d00806324f6452841d1388e Mon Sep 17 00:00:00 2001
From: MartinuzziFrancesco
Date: Tue, 26 Nov 2024 15:11:02 +0100
Subject: [PATCH] readme changes

---
 README.md                               |  9 ++++++--
 docs/src/esn_tutorials/change_layers.md | 28 ++++++++++++++-----------
 src/ReservoirComputing.jl               | 10 ++++-----
 3 files changed, 28 insertions(+), 19 deletions(-)

diff --git a/README.md b/README.md
index 23b51ad7..82154140 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,8 @@
-# ReservoirComputing.jl
+
+
+
+
+
 
 [![Join the chat at https://julialang.zulipchat.com #sciml-bridged](https://img.shields.io/static/v1?label=Zulip&message=chat&color=9558b2&labelColor=389826)](https://julialang.zulipchat.com/#narrow/stream/279055-sciml-bridged)
 [![Global Docs](https://img.shields.io/badge/docs-SciML-blue.svg)](https://docs.sciml.ai/ReservoirComputing/stable/)
@@ -11,8 +15,9 @@
 [![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor%27s%20Guide-blueviolet)](https://github.com/SciML/ColPrac)
 [![SciML Code Style](https://img.shields.io/static/v1?label=code%20style&message=SciML&color=9558b2&labelColor=389826)](https://github.com/SciML/SciMLStyle)
 
-![rc_full_logo_large_white_cropped](https://user-images.githubusercontent.com/10376688/144242116-8243f58a-5ac6-4e0e-88d5-3409f00e20b4.png)
+
+# ReservoirComputing.jl
 
 ReservoirComputing.jl provides an efficient, modular and easy-to-use implementation of Reservoir Computing models such as Echo State Networks (ESNs). For information on using this package please refer to the [stable documentation](https://docs.sciml.ai/ReservoirComputing/stable/). Use the [in-development documentation](https://docs.sciml.ai/ReservoirComputing/dev/) to take a look at not yet released features.
 
 ## Quick Example
diff --git a/docs/src/esn_tutorials/change_layers.md b/docs/src/esn_tutorials/change_layers.md
index 906d693b..f10c869e 100644
--- a/docs/src/esn_tutorials/change_layers.md
+++ b/docs/src/esn_tutorials/change_layers.md
@@ -7,7 +7,9 @@ weights = init(rng, dims...) #rng is optional
 weights = init(dims...)
 ```
+
 Additional keywords can be added when needed:
+
 ```julia
 weights_init = init(rng; kwargs...)
 weights = weights_init(rng, dims...)
 ```
@@ -32,26 +34,27 @@ predict_len = 2000
 ds = Systems.henon()
 traj, t = trajectory(ds, 7000)
 data = Matrix(traj)'
-data = (data .-0.5) .* 2
+data = (data .- 0.5) .* 2
 
 shift = 200
-training_input = data[:, shift:shift+train_len-1]
-training_target = data[:, shift+1:shift+train_len]
-testing_input = data[:,shift+train_len:shift+train_len+predict_len-1]
-testing_target = data[:,shift+train_len+1:shift+train_len+predict_len]
+training_input = data[:, shift:(shift + train_len - 1)]
+training_target = data[:, (shift + 1):(shift + train_len)]
+testing_input = data[:, (shift + train_len):(shift + train_len + predict_len - 1)]
+testing_target = data[:, (shift + train_len + 1):(shift + train_len + predict_len)]
 ```
+
 Now it is possible to define the input layers and reservoirs we want to compare and run the comparison in a simple for loop. The accuracy will be tested using the mean squared deviation (`msd`) from StatsBase.
 
 ```@example minesn
 using ReservoirComputing, StatsBase
 
 res_size = 300
-input_layer = [minimal_init(; weight = 0.85, sampling_type=:irrational),
-    minimal_init(; weight = 0.95, sampling_type=:irrational)]
-reservoirs = [simple_cycle(; weight=0.7),
-    cycle_jumps(; cycle_weight=0.7, jump_weight=0.2, jump_size=5)]
+input_layer = [minimal_init(; weight = 0.85, sampling_type = :irrational),
+    minimal_init(; weight = 0.95, sampling_type = :irrational)]
+reservoirs = [simple_cycle(; weight = 0.7),
+    cycle_jumps(; cycle_weight = 0.7, jump_weight = 0.2, jump_size = 5)]
 
-for i=1:length(reservoirs)
+for i in 1:length(reservoirs)
     esn = ESN(training_input, 2, res_size;
         input_layer = input_layer[i],
         reservoir = reservoirs[i])
@@ -60,9 +63,10 @@ for i=1:length(reservoirs)
     println(msd(testing_target, output))
 end
 ```
+
 As it is possible to see, changing layers in ESN models is straightforward. Be sure to check the API documentation for a full list of reservoirs and input layers.
 
 ## Bibliography
 
-[^rodan2012]: Rodan, Ali, and Peter Tiňo. “Simple deterministically constructed cycle reservoirs with regular jumps.” Neural computation 24.7 (2012): 1822-1852.
-[^rodan2010]: Rodan, Ali, and Peter Tiňo. “Minimum complexity echo state network.” IEEE transactions on neural networks 22.1 (2010): 131-144.
\ No newline at end of file
+[^rodan2012]: Rodan, Ali, and Peter Tiňo. “Simple deterministically constructed cycle reservoirs with regular jumps.” Neural computation 24.7 (2012): 1822-1852.
+[^rodan2010]: Rodan, Ali, and Peter Tiňo. “Minimum complexity echo state network.” IEEE transactions on neural networks 22.1 (2010): 131-144.
diff --git a/src/ReservoirComputing.jl b/src/ReservoirComputing.jl
index bd119774..8a9abab5 100644
--- a/src/ReservoirComputing.jl
+++ b/src/ReservoirComputing.jl
@@ -50,7 +50,8 @@ forecasts by recursively feeding their own outputs back as inputs for
 subsequent prediction steps.
 
 # Parameters
-- `prediction_len::Int`: The number of future steps to predict.
+
+  - `prediction_len::Int`: The number of future steps to predict.
 
 # Description
 
@@ -63,7 +64,6 @@ At each step, the model takes the current input, generates a prediction, and
 then incorporates that prediction into the input for the next step. This
 recursive process continues until the specified number of prediction steps
 (`prediction_len`) is reached.
-
 """
 struct Generative{T} <: AbstractPrediction
     prediction_len::T
@@ -82,8 +82,9 @@ where a model predicts labels based on a provided set of input features
 (`prediction_data`).
 
 # Parameters
-- `prediction_data`: The input data used for prediction, typically structured as a matrix
-  where each column represents a sample, and each row represents a feature.
+
+  - `prediction_data`: The input data used for prediction, typically structured as a matrix
+    where each column represents a sample, and each row represents a feature.
 
 # Description
 
@@ -97,7 +98,6 @@ instead, it operates on fixed input data to produce a single batch of predictions.
 
 This method is suitable for tasks like classification, regression, or other
 use cases where the input features and the number of steps are predefined.
-
 """
 function Predictive(prediction_data)
     prediction_len = size(prediction_data, 2)
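
The two prediction modes touched by this last diff are easiest to see side by side. Below is a minimal usage sketch, not part of the patch itself, assuming the `ESN`/`train` workflow shown in the `change_layers.md` tutorial above; `training_input`, `training_target`, `testing_input`, and `predict_len` are the variables defined there.

```julia
using ReservoirComputing

# Train an ESN on the 2-dimensional Henon data prepared in the tutorial above
# (assumed workflow: build the ESN, then fit a readout with `train`).
esn = ESN(training_input, 2, 300)
output_layer = train(esn, training_target)

# Generative: the model feeds its own outputs back for `predict_len` steps.
generative_output = esn(Generative(predict_len), output_layer)

# Predictive: a fixed block of input features is mapped to one batch of outputs.
predictive_output = esn(Predictive(testing_input), output_layer)
```

Under these assumptions both calls return a matrix laid out like the training targets, so the `msd(testing_target, output)` comparison from the tutorial applies to either mode.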