
Request: support >65536 pieces #2

Open
Bulat-Ziganshin opened this issue May 29, 2017 · 2 comments

Comments

@Bulat-Ziganshin

The 65536 data+parity block limit is too low for some PAR3-style applications. I would be glad to see larger block counts supported.

@catid
Owner

catid commented May 29, 2017

Challenges:

(1) Error locator polynomial evaluation: a Fast Walsh Transform over 2^32 words is not so fast. Just representing 2^31 inputs as 32-bit words takes 8.6 GB...!

(2) Multiplication cannot be done via tables anymore, so it would get slower to calculate (see the sketch after this list). Maybe Fp^2 would be better.

(3) The ErrorBitfield optimization class would be very important to implement but also very tricky.

(4) The current decoder wastes a lot of performance by rounding up to the next power of two. I don't think that's going to fly when the data is this big!

(5) General memory handling issues...
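
To put a number on (2): at 16 bits, full log/exp tables have 2^16 entries each and a multiply is two lookups; at 32 bits those tables would need 2 × 4 B × 2^32 = 32 GB, so products must be computed instead. Below is a minimal correctness sketch of computed multiplication in GF(2^32). The shift-and-add loop and the choice of the CRC-32 generator as the reduction polynomial are illustrative assumptions, not Leopard's design:

```cpp
#include <cstdint>

// Reduction polynomial: the IEEE CRC-32 generator
// x^32 + x^26 + x^23 + x^22 + x^16 + x^12 + x^11 + x^10
//      + x^8 + x^7 + x^5 + x^4 + x^2 + x + 1,
// a primitive polynomial of degree 32. Only the low 32 bits are stored;
// the x^32 term is implicit.
constexpr uint32_t kPoly32 = 0x04C11DB7u;

// Shift-and-add ("Russian peasant") multiply in GF(2^32).
// A correctness reference, not a fast path.
uint32_t gf32_mul(uint32_t a, uint32_t b) {
    uint32_t product = 0;
    while (b != 0) {
        if (b & 1u) product ^= a;          // accumulate a * x^i when bit i of b is set
        b >>= 1;
        uint32_t carry = a & 0x80000000u;  // would the degree reach 32?
        a <<= 1;                           // a *= x
        if (carry) a ^= kPoly32;           // reduce modulo the polynomial
    }
    return product;
}
```

A production path would reach for carryless-multiply instructions (PCLMULQDQ) or split 8-bit lookup tables rather than this 32-iteration loop, but the gap between it and two table lookups is the slowdown being pointed at here.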

@catid
Owner

catid commented May 29, 2017

If applications require just slightly more than 2^16 blocks, I could add e.g. a 20-bit finite field. I'd prefer to do that in response to a real demand rather than an anticipated one, though.
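
For context on why ~20 bits is a comfortable middle ground: log/exp tables for GF(2^20) hold 2^20 entries apiece, about 4 MB each at 32 bits per entry, so table-driven multiplication still works. Here is a minimal sketch assuming the known primitive trinomial x^20 + x^3 + 1; the table layout is illustrative, not a committed design:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

constexpr unsigned kBits    = 20;
constexpr uint32_t kOrder   = (1u << kBits) - 1;  // 2^20 - 1 nonzero elements
constexpr uint32_t kPolyLow = (1u << 3) | 1u;     // low bits of x^20 + x^3 + 1

// ~4 MB per table at 32 bits per entry -- still cache-unfriendly but
// entirely practical, unlike the 16 GB per table a 32-bit field needs.
std::vector<uint32_t> exp_table(kOrder + 1);
std::vector<uint32_t> log_table(1u << kBits);

void build_tables() {
    uint32_t x = 1;  // alpha^0, where alpha is the root of the trinomial
    for (uint32_t i = 0; i < kOrder; ++i) {
        exp_table[i] = x;
        log_table[x] = i;
        x <<= 1;                             // multiply by alpha = x
        if (x >> kBits)                      // degree reached 20: reduce
            x ^= (1u << kBits) | kPolyLow;
    }
    exp_table[kOrder] = 1;                   // alpha^(2^20 - 1) == 1
    assert(x == 1);  // sanity check: alpha really has full multiplicative order
}

// Multiply in GF(2^20): two lookups plus a conditional subtract.
uint32_t gf20_mul(uint32_t a, uint32_t b) {
    if (a == 0 || b == 0) return 0;
    uint32_t s = log_table[a] + log_table[b];
    if (s >= kOrder) s -= kOrder;
    return exp_table[s];
}
```

Calling build_tables() once at startup costs a single pass over the field; each multiply is then two lookups and a conditional subtract.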
