Add performance benchmarking #6
Wrk is a great tool, I've used it before. We could also use JMH for components like TeenyJson. Not sure if we need a script like profile.sh. These are some results so far.
Cached Thread Pool
alex@Alexs-MacBook-Pro ~ % wrk --latency -d 60s -c 100 -t 8 http://localhost:8080/store/products

Work Stealing Pool
alex@Alexs-MacBook-Pro ~ % wrk --latency -d 60s -c 100 -t 8 http://localhost:8080/store/products

Virtual Threads
alex@Alexs-MacBook-Pro ~ % wrk --latency -d 60s -c 100 -t 8 http://localhost:8080/store/products

Environment
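For reference, a minimal sketch of how the three executor strategies compared above can be instantiated with standard JDK factory methods. How TeenyHttpd actually wires its executor is not shown here; this only illustrates the candidates.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ExecutorStrategies {

    // Unbounded pool that reuses idle threads; a common default for bursty request loads.
    static ExecutorService cachedThreadPool() {
        return Executors.newCachedThreadPool();
    }

    // ForkJoin-based pool sized to the number of available processors.
    static ExecutorService workStealingPool() {
        return Executors.newWorkStealingPool();
    }

    // One virtual thread per task (Java 21+); cheap to create, suited to blocking I/O.
    static ExecutorService virtualThreads() {
        return Executors.newVirtualThreadPerTaskExecutor();
    }
}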
👉 Wrk Tool
We should keep this in mind when working with virtual threads:
Ready to upgrade to Java 21? 😬 I would like Teeny to have basic Prometheus metrics.
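To make the metrics idea concrete, here is a small sketch of what a /metrics endpoint in the Prometheus text exposition format could look like. It deliberately uses the JDK's built-in HttpServer as a stand-in, since wiring this into TeenyHttpd's own routing API is not shown in this thread; the metric name is a placeholder.

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.atomic.LongAdder;

public class MetricsSketch {

    // Hypothetical counter; a real integration would increment this per handled request.
    static final LongAdder requestsTotal = new LongAdder();

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(9091), 0);
        server.createContext("/metrics", exchange -> {
            // Prometheus text format: a TYPE hint followed by "name value" lines.
            String body = "# TYPE teeny_requests_total counter\n"
                    + "teeny_requests_total " + requestsTotal.sum() + "\n";
            byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, bytes.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(bytes);
            }
        });
        server.start();
    }
}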
Results against TestServer.java
Plenty of scope for perf gains then! :)
Just profiled Teeny (IJ profiler) and it looks like there's a deadlock: everything performs well until around the 30th concurrent request. Made a few adjustments and then got ~850 req/sec. This will be fun 😀
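Aside from the IntelliJ profiler, a suspected deadlock can also be confirmed programmatically with the JDK's ThreadMXBean. This is a generic sketch, not code from Teeny:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class DeadlockCheck {
    public static void main(String[] args) {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        long[] ids = mx.findDeadlockedThreads(); // null if no deadlock detected
        if (ids == null) {
            System.out.println("No deadlocked threads found");
            return;
        }
        // Dump lock and stack information for every deadlocked thread.
        for (ThreadInfo info : mx.getThreadInfo(ids, true, true)) {
            System.out.println(info);
        }
    }
}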
At some point I really wanted to get virtual threads integrated at certain levels of the JDK, etc. It would be interesting to find other areas to improve.
I've been testing different strategies to make Teeny more performant and reached 1K req/sec. IMO the most important component for processing more requests is the ThreadPoolExecutor. I just looked at this ThreadPoolExecutor.java. Thoughts?
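For discussion, a minimal sketch of a hand-tuned ThreadPoolExecutor. The pool sizes, queue bound, and rejection policy below are illustrative assumptions, not values taken from Teeny:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class TunedExecutor {

    // Illustrative numbers only: size the core pool to the CPU count, bound the
    // queue so overload applies back-pressure, and run rejected tasks on the caller.
    static ThreadPoolExecutor create() {
        int cores = Runtime.getRuntime().availableProcessors();
        return new ThreadPoolExecutor(
                cores,                       // core pool size
                cores * 2,                   // maximum pool size
                60L, TimeUnit.SECONDS,       // idle timeout for extra threads
                new ArrayBlockingQueue<>(1_000),
                new ThreadPoolExecutor.CallerRunsPolicy());
    }
}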
Let's get the perf up!
Just implemented JMH on the TeenyJson branch; these are the results so far:
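As a reference for the shape such a benchmark could take, here is a minimal JMH throughput benchmark for JSON serialization. The JMH annotations are real, but the payload and the serializeWithTeenyJson helper are placeholders, not the actual TeenyJson API:

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;

import java.util.List;
import java.util.concurrent.TimeUnit;

@State(Scope.Benchmark)
@BenchmarkMode(Mode.Throughput)
@OutputTimeUnit(TimeUnit.SECONDS)
public class JsonSerializationBenchmark {

    // Placeholder payload; a real benchmark would use a representative object graph.
    private List<String> payload;

    @Setup
    public void setup() {
        payload = List.of("teeny", "json", "benchmark");
    }

    @Benchmark
    public String serialize() {
        // Hypothetical call: replace with the actual TeenyJson serialization entry point.
        return serializeWithTeenyJson(payload);
    }

    private String serializeWithTeenyJson(Object value) {
        // Stand-in so the sketch compiles without depending on TeenyJson internals.
        return String.valueOf(value);
    }
}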
Oops, it looks like Teeny will underperform if you run it from IntelliJ. These results are from running Teeny from the CLI:
Teeny update:
alex@Alexs-MacBook-Pro ~ % ./bombardier -d 5s http://localhost:8080
Bombarding http://localhost:8080 for 5s using 125 connection(s)
[=========================================================================================================================================================] 5s
Done!
Statistics Avg Stdev Max
Reqs/sec 15177.85 3110.70 21408.30
Latency 8.20ms 9.08ms 218.10ms
HTTP codes:
1xx - 0, 2xx - 75834, 3xx - 0, 4xx - 0, 5xx - 0
others - 400
Errors:
the server closed connection before returning the first response byte. Make sure the server returns 'Connection: close' response header before closing the connection - 372
dial tcp [::1]:8080: connect: connection refused - 27
write tcp 127.0.0.1:53575->127.0.0.1:8080: write: broken pipe - 1
Throughput: 2.98MB/s
It would be cool to use something like this to start benchmarking and improving the performance of TeenyHttpd.
Thoughts @alex-cova ?