NoisyLeaky behaviour: spiking and membrane potential reset independently stochastic #366

Open
leoauri opened this issue Feb 13, 2025 · 4 comments

leoauri commented Feb 13, 2025

From what I observe of NoisyLeaky neurons, spiking and membrane potential reset are independently stochastic.

I am not at all experienced with SNNs, but wouldn't the expected behaviour be that the membrane potential resets if and only if a spike is emitted?

Here is code where I observe this. I'm using reset mechanism "zero" here, but "subtract" produces similar behaviour:

import snntorch as snn
from snntorch import NoisyLeaky
import torch

# Noisy LIF neuron with decay rate beta=0.5 and reset-to-zero
n = NoisyLeaky(beta=0.5, output=True, reset_mechanism="zero")

# Drive the neuron with a constant input current of 1.0 for 20 steps
steps = 20
outs = []
mem = torch.zeros(1)
for t in range(steps):
    out, mem = n(torch.tensor([[1.0]]), mem)
    outs.append((out, mem))

outs

Example output:

[(tensor([[1.]]), tensor([[1.]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[1.]]), tensor([[1.]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[0.]]), tensor([[1.]])),
 (tensor([[1.]]), tensor([[1.5000]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[1.]]), tensor([[1.]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[0.]]), tensor([[1.]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[1.]]), tensor([[1.]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[1.]]), tensor([[1.]])),
 (tensor([[0.]]), tensor([[1.5000]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[1.]]), tensor([[1.]])),
 (tensor([[1.]]), tensor([[1.5000]])),
 (tensor([[0.]]), tensor([[0.]])),
 (tensor([[1.]]), tensor([[1.]]))]

You can see here that the spikes and the resets do not generally coincide: there are steps where the membrane potential reads 1.0 or even 1.5 without a spike, and steps where a spike is emitted but the membrane is not reset.
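To make that concrete, here's a quick tally over the recorded pairs (my own addition; it assumes NoisyLeaky's default threshold of 1.0):

threshold = 1.0  # the default, as far as I can tell
no_spike_above = [(spk.item(), mem.item()) for spk, mem in outs
                  if spk.item() == 0 and mem.item() >= threshold]
print(len(no_spike_above), no_spike_above)

In the run above this flags the (0., 1.0) and (0., 1.5) entries, i.e. steps where the membrane sits at or above threshold without any spike.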

Here's a plot of that:

[Plot: spike output and membrane potential over the 20 steps; the spikes and membrane resets visibly fail to line up.]

I also adapted one of the tutorials trying to figure this out:

lif2 = snn.NoisyLeaky(beta=0.6)

# Initialize inputs and outputs: 10 steps of zero input followed by
# 190 steps of constant 0.2 input
num_steps = 200
cur_in = torch.cat((torch.zeros(10, 1), torch.ones(190, 1)*0.2), 0)
mem = torch.zeros(1)
spk_out = torch.zeros(1)
mem_rec = [mem]
spk_rec = [spk_out]

# Simulation run across 200 time steps
for step in range(num_steps):
  spk_out, mem = lif2(cur_in[step], mem)
  mem_rec.append(mem)
  spk_rec.append(spk_out)

# convert lists to tensors
mem_rec = torch.stack(mem_rec)
spk_rec = torch.stack(spk_rec)

# plot_cur_mem_spk is the plotting helper from the snnTorch tutorial
plot_cur_mem_spk(cur_in, mem_rec, spk_rec, thr_line=1, vline=109, ylim_max2=1.3,
                 title="NoisyLeaky Neuron Model With Step Input")

[Plot: input current, membrane potential, and spike output from the adapted tutorial run.]

If I have my head screwed on wrong, someone please correct that. Otherwise, it would be great if the author of pull request #230, @genema, could shed light on this!

leoauri commented Feb 14, 2025

Does leoauri@7fe3f6f reflect the desired/expected behaviour?

...
        for t in range(steps):
            spk, mem = noisyleaky_reset_zero_output_instance(torch.Tensor([[1.0]]), mem)
            # with reset "zero", a spike should coincide with the membrane
            # reading zero on that step, and no spike with a non-zero membrane
            assert bool(spk) == (not bool(mem))

leoauri commented Feb 15, 2025

NoisyLeaky seems to mirror the design of Leaky. self.spike_grad is called twice in these classes: once in mem_reset and once in fire (or fire_inhibition). In Leaky, spike_grad is set to SpikingNeuron.ATan.apply, whose forward pass is a deterministic Heaviside step, so the double call is harmless; but in NoisyLeaky it is set to one of the various stochastic firing functions, so each forward pass draws two independent noise samples: one decides the reset and the other decides the emitted spike...
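For illustration, here's a minimal sketch of the two patterns (my own simplification with a made-up Gaussian firing rule; noisy_fire, step_two_draws and step_one_draw are hypothetical names, not snnTorch internals):

import torch

def noisy_fire(mem, threshold=1.0, sigma=0.2):
    # Stand-in for a stochastic firing function:
    # spike if mem - threshold + noise > 0
    return ((mem - threshold + sigma * torch.randn_like(mem)) > 0).float()

def step_two_draws(mem, input_, beta=0.5):
    # The problematic pattern: the stochastic rule is sampled once for
    # the reset mask and again for the emitted spike, so the two
    # decisions use independent noise and need not agree.
    reset = noisy_fire(mem)                    # draw 1: reset decision
    mem = (beta * mem + input_) * (1 - reset)  # reset-to-zero dynamics
    spk = noisy_fire(mem)                      # draw 2: spike decision
    return spk, mem

def step_one_draw(mem, input_, beta=0.5):
    # Sampling once and reusing the draw couples the two decisions:
    # the membrane resets iff a spike is emitted.
    mem = beta * mem + input_
    spk = noisy_fire(mem)   # one draw decides both
    mem = mem * (1 - spk)   # reset-to-zero iff spiked
    return spk, mem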


leoauri commented Feb 15, 2025

Is it the case that the membrane potential returned by the forward pass is pre-spike, so that the reset from a spike is actually applied in the subsequent forward pass through Leaky? @jeshraghian @ahenkes1
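For reference, here's a deterministic sketch of the two orderings in question (my own simplification, not snnTorch's actual code; the function names are made up):

import torch

def forward_lagged(mem, input_, beta=0.5, threshold=1.0):
    # Convention A: the reset mask is computed from the *previous*
    # membrane potential, so the returned mem is pre-reset with respect
    # to the spike returned alongside it; the reset only takes effect
    # on the next forward call.
    reset = (mem >= threshold).float()
    mem = (beta * mem + input_) * (1 - reset)
    spk = (mem >= threshold).float()
    return spk, mem

def forward_same_step(mem, input_, beta=0.5, threshold=1.0):
    # Convention B: the reset is applied on the same step the spike is
    # emitted, so the returned mem is post-reset and a spike always
    # coincides with a zeroed membrane.
    mem = beta * mem + input_
    spk = (mem >= threshold).float()
    mem = mem * (1 - spk)
    return spk, mem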


leoauri commented Feb 15, 2025

Pull request #368 addresses this. Feedback welcome @genema @jeshraghian
