Small docstring fixes #229

Merged (2 commits, Nov 26, 2024)
9 changes: 7 additions & 2 deletions README.md
@@ -1,4 +1,8 @@
# ReservoirComputing.jl
<p align="center">
<img width="400px" src="docs/src/assets/logo.png"/>
</p>

<div align="center">

[![Join the chat at https://julialang.zulipchat.com #sciml-bridged](https://img.shields.io/static/v1?label=Zulip&message=chat&color=9558b2&labelColor=389826)](https://julialang.zulipchat.com/#narrow/stream/279055-sciml-bridged)
[![Global Docs](https://img.shields.io/badge/docs-SciML-blue.svg)](https://docs.sciml.ai/ReservoirComputing/stable/)
@@ -11,8 +15,9 @@
[![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor%27s%20Guide-blueviolet)](https://github.com/SciML/ColPrac)
[![SciML Code Style](https://img.shields.io/static/v1?label=code%20style&message=SciML&color=9558b2&labelColor=389826)](https://github.com/SciML/SciMLStyle)

![rc_full_logo_large_white_cropped](https://user-images.githubusercontent.com/10376688/144242116-8243f58a-5ac6-4e0e-88d5-3409f00e20b4.png)
</div>

# ReservoirComputing.jl
ReservoirComputing.jl provides an efficient, modular, and easy-to-use implementation of Reservoir Computing models such as Echo State Networks (ESNs). For information on using this package, please refer to the [stable documentation](https://docs.sciml.ai/ReservoirComputing/stable/). Use the [in-development documentation](https://docs.sciml.ai/ReservoirComputing/dev/) to take a look at not yet released features.

## Quick Example
28 changes: 16 additions & 12 deletions docs/src/esn_tutorials/change_layers.md
@@ -7,7 +7,9 @@ weights = init(rng, dims...)
#rng is optional
weights = init(dims...)
```

Additional keywords can be added when needed:

```julia
weights_init = init(rng; kwargs...)
weights = weights_init(rng, dims...)
@@ -32,26 +34,27 @@ predict_len = 2000
ds = Systems.henon()
traj, t = trajectory(ds, 7000)
data = Matrix(traj)'
data = (data .-0.5) .* 2
data = (data .- 0.5) .* 2
shift = 200

training_input = data[:, shift:shift+train_len-1]
training_target = data[:, shift+1:shift+train_len]
testing_input = data[:,shift+train_len:shift+train_len+predict_len-1]
testing_target = data[:,shift+train_len+1:shift+train_len+predict_len]
training_input = data[:, shift:(shift + train_len - 1)]
training_target = data[:, (shift + 1):(shift + train_len)]
testing_input = data[:, (shift + train_len):(shift + train_len + predict_len - 1)]
testing_target = data[:, (shift + train_len + 1):(shift + train_len + predict_len)]
```

Now it is possible to define the input layers and reservoirs we want to compare and run the comparison in a simple for loop. The accuracy will be tested using the mean squared deviation (`msd`) from StatsBase.

```@example minesn
using ReservoirComputing, StatsBase

res_size = 300
input_layer = [minimal_init(; weight = 0.85, sampling_type=:irrational),
minimal_init(; weight = 0.95, sampling_type=:irrational)]
reservoirs = [simple_cycle(; weight=0.7),
cycle_jumps(; cycle_weight=0.7, jump_weight=0.2, jump_size=5)]
input_layer = [minimal_init(; weight = 0.85, sampling_type = :irrational),
minimal_init(; weight = 0.95, sampling_type = :irrational)]
reservoirs = [simple_cycle(; weight = 0.7),
cycle_jumps(; cycle_weight = 0.7, jump_weight = 0.2, jump_size = 5)]

for i=1:length(reservoirs)
for i in 1:length(reservoirs)
esn = ESN(training_input, 2, res_size;
input_layer = input_layer[i],
reservoir = reservoirs[i])
@@ -60,9 +63,10 @@ for i=1:length(reservoirs)
println(msd(testing_target, output))
end
```

As it is possible to see, changing layers in ESN models is straightforward. Be sure to check the API documentation for a full list of reservoirs and layers.

## Bibliography
[^rodan2012]: Rodan, Ali, and Peter Tiňo. “Simple deterministically constructed cycle reservoirs with regular jumps.” Neural computation 24.7 (2012): 1822-1852.

[^rodan2010]: Rodan, Ali, and Peter Tiňo. “Minimum complexity echo state network.” IEEE transactions on neural networks 22.1 (2010): 131-144.
[^rodan2012]: Rodan, Ali, and Peter Tiňo. “Simple deterministically constructed cycle reservoirs with regular jumps.” Neural computation 24.7 (2012): 1822-1852.
[^rodan2010]: Rodan, Ali, and Peter Tiňo. “Minimum complexity echo state network.” IEEE transactions on neural networks 22.1 (2010): 131-144.
43 changes: 40 additions & 3 deletions src/ReservoirComputing.jl
@@ -45,8 +45,25 @@ end
"""
Generative(prediction_len)

This prediction methodology allows the models to produce an autonomous prediction, feeding the prediction into itself to generate the next step.
The only parameter needed is the number of steps for the prediction.
A prediction strategy that enables models to generate autonomous multi-step
forecasts by recursively feeding their own outputs back as inputs for
subsequent prediction steps.

# Parameters

- `prediction_len::Int`: The number of future steps to predict.

# Description

The `Generative` prediction method allows a model to perform multi-step
forecasting by using its own previous predictions as inputs for future predictions.
This approach is especially useful in time series analysis, where each prediction
depends on the preceding data points.

At each step, the model takes the current input, generates a prediction,
and then incorporates that prediction into the input for the next step.
This recursive process continues until the specified
number of prediction steps (`prediction_len`) is reached.
"""
struct Generative{T} <: AbstractPrediction
prediction_len::T
@@ -60,7 +77,27 @@ end
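As a quick illustration of the behavior the expanded `Generative` docstring describes, here is a minimal sketch of recursive forecasting. The toy sine-wave data, reservoir size, and variable names are invented for the example; the `ESN`, `train`, and `esn(Generative(n), output_layer)` calls follow the package's documented prediction API.

```julia
using ReservoirComputing

# Hypothetical toy series: each column of the data matrix is one time step.
data = reshape(sin.(0.1 .* (1:500)), 1, :)
train_input = data[:, 1:(end - 1)]
train_target = data[:, 2:end]          # learn the map x(t) -> x(t + 1)

esn = ESN(train_input, 1, 100)         # 1 input feature, reservoir of size 100
output_layer = train(esn, train_target)

# Generative mode: the model feeds each prediction back as the next input,
# autonomously rolling the series forward for 200 steps.
prediction = esn(Generative(200), output_layer)
```

Note that only `prediction_len` is needed here: in generative mode the inputs after the first step come from the model itself, not from held-out data.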
"""
Predictive(prediction_data)

Given a set of labels as `prediction_data`, this method of prediction will return the corresponding labels in a standard Machine Learning fashion.
A prediction strategy for supervised learning tasks,
where a model predicts labels based on a provided set
of input features (`prediction_data`).

# Parameters

- `prediction_data`: The input data used for prediction, typically structured as a matrix
where each column represents a sample, and each row represents a feature.

# Description

The `Predictive` prediction method is a standard approach
in supervised machine learning tasks. It uses the provided input data
(`prediction_data`) to produce corresponding labels or outputs based
on the learned relationships in the model. Unlike generative prediction,
this method does not recursively feed predictions into the model;
instead, it operates on fixed input data to produce a single batch of predictions.

This method is suitable for tasks like classification,
regression, or other use cases where the input features
and the number of steps are predefined.
"""
function Predictive(prediction_data)
prediction_len = size(prediction_data, 2)
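To contrast with the recursive mode, a hedged sketch of `Predictive` prediction as the expanded docstring describes it: fixed inputs in, one batch of outputs out. The feature matrix, toy target, and sizes below are made up for illustration; the constructor and prediction calls follow the package's documented API.

```julia
using ReservoirComputing

# Hypothetical supervised setup: columns are samples, rows are features.
features = rand(3, 400)
targets = sum(features; dims = 1)      # toy regression target

esn = ESN(features, 3, 100)            # 3 input features, reservoir of size 100
output_layer = train(esn, targets)

# Predictive mode: the model maps each provided input column to an output,
# with no feedback of predictions into the inputs.
new_features = rand(3, 50)
predictions = esn(Predictive(new_features), output_layer)
```

Here the number of prediction steps is implied by `size(prediction_data, 2)`, which is exactly what the `Predictive` constructor in this diff computes.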
9 changes: 0 additions & 9 deletions src/esn/deepesn.jl
@@ -55,11 +55,6 @@ temporal features.
- `matrix_type`: The type of matrix used for storing the training data.
Default is inferred from `train_data`.

# Returns

- A `DeepESN` instance configured according to the provided parameters
and suitable for further training and prediction tasks.

# Example

```julia
@@ -73,10 +68,6 @@ deepESN = DeepESN(train_data, 10, 100, depth = 3, washout = 100)
train(deepESN, target_data)
prediction = predict(deepESN, new_data)
```

The DeepESN model is ideal for tasks requiring the processing of sequences with
complex temporal dependencies, benefiting from the multiple reservoirs to capture
different levels of abstraction and temporal dynamics.
"""
function DeepESN(train_data,
in_size::Int,
6 changes: 3 additions & 3 deletions src/esn/esn.jl
@@ -21,9 +21,9 @@ Creates an Echo State Network (ESN) using specified parameters and training data

- `train_data`: Matrix of training data (columns as time steps, rows as features).
- `variation`: Variation of ESN (default: `Default()`).
- `input_layer`: Input layer of ESN (default: `DenseLayer()`).
- `reservoir`: Reservoir of the ESN (default: `RandSparseReservoir(100)`).
- `bias`: Bias vector for each time step (default: `NullLayer()`).
- `input_layer`: Input layer of ESN.
- `reservoir`: Reservoir of the ESN.
- `bias`: Bias vector for each time step.
- `reservoir_driver`: Mechanism for evolving reservoir states (default: `RNN()`).
- `nla_type`: Non-linear activation type (default: `NLADefault()`).
- `states_type`: Format for storing states (default: `StandardStates()`).