Divide the data into `n` chunks such that any `k` of them suffice to recover the information, while fewer than `k` shares reveal nothing about the original data. An example is Shamir's secret sharing.

This could be combined with other processors; for instance, the data payload could be split between a few cloud providers. By metadata I mean, for example, an index or an ephemeral symmetric key. But the more common case would be to apply threshold secret sharing to the stream itself.
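For concreteness, here is a minimal sketch of byte-wise Shamir secret sharing over GF(2^8) in Go (the repository's language). The `Split`/`Combine` names and the map-keyed share format are illustrative only, not an existing proc API; the sketch assumes `n` ≤ 255 and does nothing to authenticate shares.

```go
// Minimal sketch of Shamir secret sharing over GF(2^8).
// Illustrative only: Split/Combine are hypothetical names, not a scat API.
package main

import (
	"crypto/rand"
	"fmt"
	"log"
)

// gfMul multiplies two elements of GF(2^8) modulo the AES
// polynomial x^8 + x^4 + x^3 + x + 1 (0x11b).
func gfMul(a, b byte) byte {
	var p byte
	for b > 0 {
		if b&1 == 1 {
			p ^= a
		}
		carry := a & 0x80
		a <<= 1
		if carry != 0 {
			a ^= 0x1b
		}
		b >>= 1
	}
	return p
}

// gfInv returns the multiplicative inverse: a^254 == a^-1 in GF(2^8).
func gfInv(a byte) byte {
	r := byte(1)
	for i := 0; i < 254; i++ {
		r = gfMul(r, a)
	}
	return r
}

// Split produces n shares of the secret; any k of them recover it.
// Each secret byte becomes the constant term of a random
// degree-(k-1) polynomial, evaluated at x = 1..n.
func Split(secret []byte, n, k int) (map[byte][]byte, error) {
	shares := make(map[byte][]byte, n)
	for x := 1; x <= n; x++ {
		shares[byte(x)] = make([]byte, len(secret))
	}
	coeffs := make([]byte, k)
	for i, s := range secret {
		coeffs[0] = s
		if _, err := rand.Read(coeffs[1:]); err != nil {
			return nil, err
		}
		for x := 1; x <= n; x++ {
			// Horner's rule evaluation at point x.
			y := coeffs[k-1]
			for j := k - 2; j >= 0; j-- {
				y = gfMul(y, byte(x)) ^ coeffs[j]
			}
			shares[byte(x)][i] = y
		}
	}
	return shares, nil
}

// Combine recovers the secret from any k shares by Lagrange
// interpolation at x = 0 (addition and subtraction are both XOR).
func Combine(shares map[byte][]byte) []byte {
	var xs []byte
	var length int
	for x, ys := range shares {
		xs = append(xs, x)
		length = len(ys)
	}
	secret := make([]byte, length)
	for i := range secret {
		var acc byte
		for _, xj := range xs {
			// Lagrange basis at 0: prod(x_m) / prod(x_m ^ x_j), m != j.
			num, den := byte(1), byte(1)
			for _, xm := range xs {
				if xm != xj {
					num = gfMul(num, xm)
					den = gfMul(den, xm^xj)
				}
			}
			acc ^= gfMul(shares[xj][i], gfMul(num, gfInv(den)))
		}
		secret[i] = acc
	}
	return secret
}

func main() {
	secret := []byte("attack at dawn")
	shares, err := Split(secret, 5, 3) // n = 5 shares, any k = 3 recover
	if err != nil {
		log.Fatal(err)
	}
	// Any 3 of the 5 shares reconstruct the secret.
	subset := map[byte][]byte{1: shares[1], 3: shares[3], 5: shares[5]}
	fmt.Printf("%s\n", Combine(subset))
}
```

Since each Shamir share is as large as the secret, in practice one would share only small metadata this way (an index, an ephemeral key) and split the bulk payload with an erasure code.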
Hi @Intensity. Sorry for the late reply; my notifications were disabled 🙈.
Indeed, the processing you suggest would take the form of two new procs: one for splitting into parts (like parity), one for joining them (like uparity).
I don't need such a proc, so I must admit I don't have the motivation to invest time in it. Would you?