
Grouping updates together into single transactions #196

Open
tasket opened this issue May 20, 2024 · 0 comments
tasket commented May 20, 2024

Grouping multiple metadata updates from a send operation (for example) into a single transaction may improve both speed and integrity.

Ideas:

  • Add metadata .tmp updates to a queue in the ArchiveSet instead of going through the normal save-and-rename chain.
  • Use a global transaction id and append it to the name of each metadata file (instead of using a '.tmp' suffix).
  • Perform all update_delta_digest() calls in one batch (for incrementals), group the zero-change volumes together during send, then process the resulting metadata update queue. This avoids considerable latency overhead when --send-unchanged is used.
  • Possibly use this to simplify recovery from an in_process interruption.
  • Possibly re-use the tar stream process, starting and managing it in monitor_send() instead.
@tasket tasket added the enhancement New feature or request label May 20, 2024
@tasket tasket added this to the v1.0 milestone May 20, 2024