Reducing memory allocations #4

Open
2 of 4 tasks
azeno opened this issue Feb 20, 2021 · 0 comments

Comments

@azeno
Member

azeno commented Feb 20, 2021

After profiling a project that made heavy use of this library (~70 sender/receiver nodes in total), we identified the following memory-allocation hot spots:

  • DatagramSender -> Socket.SendToAsync: not much we can do about this except wait, see Proposal: Zero allocation connectionless sockets dotnet/runtime#30797
  • MatchAddress - string splitting and MessagePattern construction add up. Can the split be deferred until we indeed have a match? Can the message pattern be cached? (A possible shape for this is sketched after the commit notes below.)
  • PadNull - If I understand it correctly, OSC requires byte blobs and strings to be 4-byte aligned? In that case, allocating the memory with the proper padded size up front would be more beneficial (see the packing sketch after this list).
  • Unpack -> UnpackMessage and UnpackString. Both can be done more efficiently using Span (see the unpacking sketch after this list).
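
For illustration, here is a minimal sketch of allocating the padded size up front instead of padding afterwards. The names (OscWriter, PaddedStringSize, PaddedBlobSize, PackString) are hypothetical and not this library's actual API; the size calculation just follows OSC's rule that strings carry at least one null terminator and that both strings and blobs are padded to a multiple of 4 bytes:

```csharp
using System;
using System.Text;

static class OscWriter
{
    // Hypothetical helpers: instead of writing the raw bytes and then producing
    // a second, padded copy in a PadNull-style step, compute the padded size
    // first and write into a buffer of exactly that size.

    // OSC strings: content + at least one null terminator, padded to a multiple of 4.
    public static int PaddedStringSize(string value) =>
        (Encoding.ASCII.GetByteCount(value) + 4) & ~3;

    // OSC blobs: raw bytes padded to a multiple of 4 (the 4-byte size prefix is written separately).
    public static int PaddedBlobSize(int blobLength) =>
        (blobLength + 3) & ~3;

    public static byte[] PackString(string value)
    {
        // A freshly allocated byte[] is already zero-filled, so the padding bytes come for free.
        var buffer = new byte[PaddedStringSize(value)];
        Encoding.ASCII.GetBytes(value, 0, value.Length, buffer, 0);
        return buffer;
    }
}
```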
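Similarly, a minimal sketch of a span-based UnpackString, assuming a hypothetical signature (the library's actual method may look different): it reads the null-terminated string straight out of the receive buffer and advances the offset past the 4-byte-aligned padding, so the only remaining allocation is the resulting string itself:

```csharp
using System;
using System.Text;

static class OscReader
{
    // Hypothetical helper: reads an OSC string (null-terminated, padded to a
    // multiple of 4 bytes) starting at 'offset' and advances 'offset' past the padding.
    public static string UnpackString(ReadOnlySpan<byte> buffer, ref int offset)
    {
        var slice = buffer.Slice(offset);
        var terminator = slice.IndexOf((byte)0);
        if (terminator < 0)
            throw new FormatException("OSC string is not null-terminated.");

        // Only this string allocation remains; no intermediate byte[] copies.
        var value = Encoding.ASCII.GetString(slice.Slice(0, terminator));

        // Skip the terminator and round up to the next 4-byte boundary.
        offset += (terminator + 4) & ~3;
        return value;
    }
}
```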

Attached is a screenshot from the profiler:
[profiler screenshot]

azeno added a commit to azeno/VL.IO.OSC that referenced this issue Feb 21, 2021
- UnpackMessage and UnpackString use Span
- MessagePattern.IsMatch is now a static method based on spans as well. It will only allocate if a regex is indeed used.
- Use a more memory-friendly value tuple in OSCServer
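
For reference, a static, span-based IsMatch could roughly take the following shape. This is a sketch under assumptions, not necessarily the committed implementation, and it only translates the `*` and `?` wildcards; the full OSC pattern syntax (`[]`, `{}`) would need further translation. The point is the fast path: when the pattern contains no wildcards, the comparison is a plain span equality check and nothing is allocated; a Regex is only built (and could additionally be cached per pattern, as the issue suggests) when wildcards are present:

```csharp
using System;
using System.Text.RegularExpressions;

static class MessagePatternSketch
{
    // Hypothetical sketch: match an OSC address against a pattern without
    // splitting either string into parts first.
    public static bool IsMatch(ReadOnlySpan<char> pattern, ReadOnlySpan<char> address)
    {
        // Fast path: no OSC wildcards -> plain ordinal comparison, zero allocations.
        if (pattern.IndexOfAny("*?[]{}".AsSpan()) < 0)
            return pattern.SequenceEqual(address);

        // Slow path: fall back to a regex; only here do we allocate.
        // '*' matches any run of characters within one address part, '?' a single character.
        var regexPattern = "^" + Regex.Escape(pattern.ToString())
            .Replace(@"\*", "[^/]*")
            .Replace(@"\?", "[^/]") + "$";
        return Regex.IsMatch(address.ToString(), regexPattern);
    }
}
```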