accuracy #8

Open

meltingCat opened this issue Jan 6, 2021 · 1 comment
meltingCat commented Jan 6, 2021

Hi, I ran the script train_lenet_decolle.py with the default settings and got test_acc.npy in the logs folder. The result is below; am I doing this right? What is the meaning of each column?

Python 3.8.3 (default, Jul  2 2020, 16:21:59) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy as np
>>> np.load('test_acc.npy')
array([[0.67275785, 0.96008969, 0.97578475, 0.97421525],
       [0.71950673, 0.9706278 , 0.98307175, 0.98273543],
       [0.74204036, 0.97253363, 0.98430493, 0.97443946],
       [0.75257848, 0.97443946, 0.98598655, 0.98363229],
       [0.76345291, 0.97600897, 0.98688341, 0.98542601],
       [0.77399103, 0.97780269, 0.98643498, 0.9853139 ],
       [0.78195067, 0.97959641, 0.98800448, 0.98508969],
       [0.78396861, 0.97847534, 0.98553812, 0.97970852],
       [0.78251121, 0.97959641, 0.98789238, 0.98508969],
       [0.78026906, 0.97982063, 0.98766816, 0.9793722 ],
       [0.78621076, 0.97982063, 0.98665919, 0.97701794],
       [0.79091928, 0.98038117, 0.98867713, 0.98441704],
       [0.79147982, 0.98038117, 0.98834081, 0.98408072],
       [0.79248879, 0.98127803, 0.9896861 , 0.97096413],
       [0.79809417, 0.98161435, 0.98878924, 0.98206278],
       [0.79002242, 0.97959641, 0.98834081, 0.982287  ],
       [0.7970852 , 0.98049327, 0.98946188, 0.97825112],
       [0.79585202, 0.98004484, 0.98811659, 0.9808296 ],
       [0.79730942, 0.9809417 , 0.98778027, 0.9808296 ],
       [0.7941704 , 0.9809417 , 0.98856502, 0.98520179],
       [0.79495516, 0.98161435, 0.9882287 , 0.98307175],
       [0.8014574 , 0.98139013, 0.98923767, 0.97679372],
       [0.79977578, 0.9794843 , 0.98744395, 0.98026906],
       [0.79977578, 0.98139013, 0.98800448, 0.98183857],
       [0.79742152, 0.98150224, 0.98834081, 0.97780269],
       [0.8       , 0.98150224, 0.98834081, 0.98172646],
       [0.80190583, 0.98195067, 0.98912556, 0.98195067],
       [0.80403587, 0.98116592, 0.98845291, 0.98127803],
       [0.80807175, 0.98060538, 0.98811659, 0.97993274],
       [0.80257848, 0.98150224, 0.98800448, 0.97813901],
       [0.80235426, 0.98206278, 0.99058296, 0.98475336],
       [0.80493274, 0.98172646, 0.99103139, 0.98497758],
       [0.80639013, 0.98217489, 0.9896861 , 0.98452915],
       [0.80672646, 0.98273543, 0.99024664, 0.9853139 ],
       [0.80325112, 0.98172646, 0.98957399, 0.98497758],
       [0.80560538, 0.98251121, 0.99013453, 0.98598655],
       [0.80526906, 0.98172646, 0.98991031, 0.98733184],
       [0.80257848, 0.98206278, 0.99002242, 0.98396861],
       [0.80504484, 0.982287  , 0.99002242, 0.98452915],
       [0.80515695, 0.98217489, 0.99069507, 0.9867713 ],
       [0.80818386, 0.98206278, 0.99035874, 0.98553812],
       [0.80493274, 0.9823991 , 0.99035874, 0.98699552],
       [0.80661435, 0.98262332, 0.99058296, 0.98542601],
       [0.80751121, 0.98195067, 0.99002242, 0.98710762],
       [0.80795964, 0.98206278, 0.99013453, 0.98621076],
       [0.80616592, 0.98262332, 0.99024664, 0.98654709],
       [0.8059417 , 0.98217489, 0.9896861 , 0.98497758],
       [0.8058296 , 0.982287  , 0.99024664, 0.98508969],
       [0.80538117, 0.98273543, 0.99002242, 0.98452915],
       [0.807287  , 0.98217489, 0.99024664, 0.98721973],
       [0.80695067, 0.98150224, 0.98912556, 0.98587444],
       [0.80773543, 0.98295964, 0.98979821, 0.98699552],
       [0.80807175, 0.98206278, 0.98991031, 0.98587444],
       [0.8088565 , 0.98139013, 0.98991031, 0.9853139 ],
       [0.80840807, 0.98206278, 0.99035874, 0.98396861],
       [0.80695067, 0.98262332, 0.99035874, 0.98609865],
       [0.80840807, 0.98251121, 0.99013453, 0.98542601],
       [0.80426009, 0.98262332, 0.9896861 , 0.98609865],
       [0.80784753, 0.9823991 , 0.99013453, 0.98396861],
       [0.807287  , 0.982287  , 0.99024664, 0.98553812],
       [0.80695067, 0.9823991 , 0.99058296, 0.98609865],
       [0.80616592, 0.98273543, 0.99047085, 0.98587444],
       [0.80852018, 0.9823991 , 0.99069507, 0.98643498],
       [0.80639013, 0.98273543, 0.98979821, 0.98699552],
       [0.8088565 , 0.982287  , 0.99058296, 0.98699552],
       [0.80762332, 0.98195067, 0.98991031, 0.98699552],
       [0.80807175, 0.98206278, 0.99013453, 0.98553812],
       [0.80639013, 0.98262332, 0.98991031, 0.98665919],
       [0.80616592, 0.98273543, 0.99047085, 0.98699552],
       [0.80773543, 0.9823991 , 0.99080717, 0.98654709],
       [0.80639013, 0.98195067, 0.99035874, 0.98710762],
       [0.80807175, 0.98251121, 0.99047085, 0.98778027],
       [0.80997758, 0.98195067, 0.99013453, 0.98621076],
       [0.80605381, 0.982287  , 0.99047085, 0.98621076],
       [0.80952915, 0.98251121, 0.99013453, 0.98733184],
       [0.80930493, 0.98206278, 0.98979821, 0.98699552],
       [0.8073991 , 0.98262332, 0.99024664, 0.98654709],
       [0.81053812, 0.98206278, 0.99024664, 0.98665919],
       [0.80930493, 0.98251121, 0.99002242, 0.98834081],
       [0.8088565 , 0.9823991 , 0.99047085, 0.98699552],
       [0.80896861, 0.98295964, 0.98991031, 0.98688341],
       [0.80919283, 0.98251121, 0.99002242, 0.98699552],
       [0.80930493, 0.98295964, 0.99002242, 0.98665919],
       [0.80840807, 0.9823991 , 0.99024664, 0.98699552],
       [0.8117713 , 0.98217489, 0.98946188, 0.9867713 ],
       [0.80964126, 0.98307175, 0.99058296, 0.98598655],
       [0.80852018, 0.98195067, 0.98991031, 0.98665919],
       [0.8103139 , 0.982287  , 0.99002242, 0.98688341],
       [0.81143498, 0.98251121, 0.99047085, 0.98755605],
       [0.80997758, 0.98284753, 0.99024664, 0.98721973],
       [0.81087444, 0.98217489, 0.9896861 , 0.98665919],
       [0.80762332, 0.98195067, 0.99024664, 0.98688341],
       [0.81121076, 0.98262332, 0.9896861 , 0.98699552],
       [0.80930493, 0.982287  , 0.98991031, 0.98654709],
       [0.80795964, 0.98251121, 0.99002242, 0.98665919],
       [0.80840807, 0.98295964, 0.99002242, 0.9867713 ],
       [0.8103139 , 0.98273543, 0.99002242, 0.9867713 ],
       [0.81076233, 0.98284753, 0.98991031, 0.98665919],
       [0.81053812, 0.9823991 , 0.99013453, 0.98654709]])
eneftci (Member) commented Jan 6, 2021

Yes, this looks correct. Each column is the test accuracy of the corresponding layer, so 0.99013453 is the accuracy of the third convolutional layer. The last column is the accuracy of a non-spiking softmax layer, assuming you didn't change the model.
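
To turn this array into a per-layer summary, a minimal sketch is given below. It assumes, following the explanation above, that each row is one test evaluation (one epoch with the default settings) and each column is one layer, with the last column being the non-spiking softmax layer; the layer names used here are only illustrative, not taken from the code.

import numpy as np

# Load the per-epoch, per-layer test accuracies written to the logs folder.
acc = np.load('test_acc.npy')   # assumed shape: (num_epochs, num_layers)

# Assumed column order: three spiking layers followed by the non-spiking softmax layer.
layer_names = ['layer 1', 'layer 2', 'layer 3', 'softmax']

for name, col in zip(layer_names, acc.T):
    best_epoch = int(np.argmax(col))
    print(f'{name}: best accuracy {col.max():.4f} at epoch {best_epoch}, '
          f'final accuracy {col[-1]:.4f}')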
