Hi, nice project!
I wrote an implementation of the Fibonacci sequence in ArnoldC and found that numbers over 2^15 were represented properly, but numbers over 2^31 wrapped around into negative values. So it appears that the integer type is actually a signed 32-bit integer, not a signed 16-bit integer as documented.
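To illustrate the wraparound being described (a hypothetical sketch in Python rather than the original ArnoldC program, assuming ArnoldC integers behave like the JVM's 32-bit signed `int`):

```python
def to_int32(n):
    """Wrap an arbitrary Python int into signed 32-bit range."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

def fib32(k):
    """Return the k-th Fibonacci number under 32-bit signed wraparound."""
    a, b = 0, 1
    for _ in range(k):
        a, b = b, to_int32(a + b)
    return a

# fib(24) = 46368 exceeds 2^15 = 32768 but is still represented correctly,
# which rules out a 16-bit signed integer.
print(fib32(24))   # 46368
# fib(47) = 2971215073 exceeds 2^31 - 1 and wraps into negative territory,
# consistent with a 32-bit signed integer.
print(fib32(47))   # -1323752223
```

If the type were truly 16-bit signed, the wraparound would already appear at fib(24).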
Links:
"The only variable type in ArnoldC is a 16bit signed integer": https://github.com/lhartikk/ArnoldC/wiki/ArnoldC