```rust
use std::str::FromStr;
use bigdecimal::BigDecimal;
use num_traits::Zero; // BigDecimal implements the num-traits `Zero` trait

let a = BigDecimal::from_str("1e99999999999999").unwrap();
let zero = BigDecimal::zero();
let is_equal = a == zero; // this comparison effectively hangs
```
I haven't spent a lot of time looking into what's happening here, but it looks like the library is getting stuck in `ten_to_the`.
I'd expect BigDecimal to function in reasonable time, or to throw errors if there are values it can't process efficiently. I wouldn't expect it to hang on processing valid values.
I'm running into these cases by using BigDecimal to parse JSON inputs which aren't under my control, which I'm going to work on avoiding in future.
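For context on why this hangs rather than erroring: the exponent here is around 10^14, and any routine that materializes 10^n for such an n has to build an integer with roughly 10^14 decimal digits. Below is a hypothetical sketch of that failure mode; the `(mantissa, scale)` representation and the `naive_eq` helper are my own illustration, not the crate's actual internals.

```rust
// Hypothetical sketch (not bigdecimal's actual code): an equality check that
// rescales one mantissa by 10^(scale difference) before comparing.
use num_bigint::BigInt;

// Builds 10^pow by repeated multiplication; the result has pow + 1 decimal
// digits, so a scale difference around 1e14 never finishes in practice.
fn ten_to_the(pow: u64) -> BigInt {
    let mut result = BigInt::from(1);
    for _ in 0..pow {
        result *= 10;
    }
    result
}

// Value is assumed to be mantissa * 10^(-scale).
fn naive_eq((l_int, l_scale): (&BigInt, i64), (r_int, r_scale): (&BigInt, i64)) -> bool {
    if l_scale >= r_scale {
        *l_int == r_int * ten_to_the((l_scale - r_scale) as u64)
    } else {
        *r_int == l_int * ten_to_the((r_scale - l_scale) as u64)
    }
}
```

An integer with ~10^14 decimal digits would occupy tens of terabytes, so the comparison never completes and simply appears to hang.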
DanielBauman88 changed the title from "Comparisons are extremely slow" to "Comparisons with large exponents are extremely slow" on Jul 24, 2023.
I was hoping to optimize `ten_to_the` as well as `PartialEq::eq`.
As is often the case, I have two competing algorithms. I used criterion to benchmark different values of n (in 10^n), and it looks like there's a clear point to switch over:
By my calculations (on my computer) it's about n=1410 (i.e. 10^1410).
Unfortunately, both curves are still climbing rapidly, so I'm going to check a few more things.
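For reference, here's roughly how such a crossover can be measured with criterion. The two candidates below, a multiply loop versus `num-bigint`'s exponentiation-by-squaring `pow`, are stand-ins chosen for illustration; the actual pair of algorithms being compared in the fix may differ.

```rust
use criterion::{criterion_group, criterion_main, BenchmarkId, Criterion};
use num_bigint::BigUint;

// Candidate 1: repeated multiplication, linear in n.
fn ten_to_the_loop(n: u64) -> BigUint {
    let mut acc = BigUint::from(1u8);
    for _ in 0..n {
        acc *= 10u8;
    }
    acc
}

// Candidate 2: num-bigint's pow (exponentiation by squaring).
fn ten_to_the_pow(n: u64) -> BigUint {
    BigUint::from(10u8).pow(n as u32)
}

fn bench_ten_to_the(c: &mut Criterion) {
    let mut group = c.benchmark_group("ten_to_the");
    // Sample n on both sides of the suspected crossover point.
    for n in [100u64, 500, 1_000, 1_410, 2_000, 5_000] {
        group.bench_with_input(BenchmarkId::new("loop", n), &n, |b, &n| {
            b.iter(|| ten_to_the_loop(n))
        });
        group.bench_with_input(BenchmarkId::new("pow", n), &n, |b, &n| {
            b.iter(|| ten_to_the_pow(n))
        });
    }
    group.finish();
}

criterion_group!(benches, bench_ten_to_the);
criterion_main!(benches);
```

Reading the two lines off criterion's report is enough to locate the value of n where it pays to switch algorithms.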
The primary issue is fixed; I will squash the commits and push soon. The fix requires allocations to compare decimals digit-by-digit, but it will be a lot more efficient than the old code (written ~7 years ago!).
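As a rough illustration of the digit-by-digit idea, assuming a `(mantissa, scale)` representation where the value is `mantissa * 10^(-scale)`; the names and details here are mine, not the actual patch:

```rust
use num_bigint::BigInt;
use num_traits::Zero;

// Equality without ever materializing 10^n.
fn eq_digitwise((l_int, l_scale): (&BigInt, i64), (r_int, r_scale): (&BigInt, i64)) -> bool {
    // Zero equals zero at any scale; this alone covers `1e99999999999999 == 0`.
    if l_int.is_zero() || r_int.is_zero() {
        return l_int.is_zero() && r_int.is_zero();
    }
    if l_int.sign() != r_int.sign() {
        return false;
    }

    // Normalize both sides by stripping trailing zeros, then compare digits and scales.
    let (l_digits, l_scale) = strip_trailing_zeros(l_int.magnitude().to_str_radix(10), l_scale);
    let (r_digits, r_scale) = strip_trailing_zeros(r_int.magnitude().to_str_radix(10), r_scale);
    l_scale == r_scale && l_digits == r_digits
}

// Drop trailing zero digits, lowering the scale once per digit removed
// (m * 10 * 10^(-s) == m * 10^(-(s - 1))).
fn strip_trailing_zeros(mut digits: String, mut scale: i64) -> (String, i64) {
    while digits.len() > 1 && digits.ends_with('0') {
        digits.pop();
        scale -= 1;
    }
    (digits, scale)
}
```

This allocates the digit strings, but the work scales with the number of significant digits rather than with the exponent, which is what makes the huge-exponent case cheap.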