From 0131f70892f507693348fff75e534210f85fd6f7 Mon Sep 17 00:00:00 2001 From: martincostello Date: Tue, 10 Sep 2024 11:29:48 +0100 Subject: [PATCH 1/8] Add Continuous Benchmarks on a Budget Add skeleton of next blog post. --- ...-continuous-benchmarks-on-a-budget.html.md | 37 +++++++++++++++++++ 1 file changed, 37 insertions(+) create mode 100644 source/2024-09-17-continuous-benchmarks-on-a-budget.html.md diff --git a/source/2024-09-17-continuous-benchmarks-on-a-budget.html.md b/source/2024-09-17-continuous-benchmarks-on-a-budget.html.md new file mode 100644 index 0000000..10928bd --- /dev/null +++ b/source/2024-09-17-continuous-benchmarks-on-a-budget.html.md @@ -0,0 +1,37 @@ +--- +title: "Continuous Benchmarks on a Budget" +date: 2024-09-17 +tags: actions,benchmarks,benchmarkdotnet,ci,dotnet,github +layout: bloglayout +description: "Using GitHub Actions, GitHub Pages and Blazor to run and visualise continuous benchmarks with BenchmarkDotNet with zero hosting costs." +image: "https://cdn.martincostello.com/blog_TODO.png" +--- + +TODO + +TODO + +READMORE + +TODO + +## Summary + +TODO + +[aspnetcore-benchmarks-dashboard]: https://aka.ms/aspnet/benchmarks "ASP.NET Core Benchmarks Power BI Dashboard" +[aspnetcore-benchmarks-code]: https://github.com/aspnet/Benchmarks "ASP.NET Core Benchmarks on GitHub" +[aspnetcore-benchmarks-docs]: https://github.com/dotnet/aspnetcore/blob/main/docs/Benchmarks.md "ASP.NET Core Benchmarks on GitHub" +[benchmarkdotnet]: https://github.com/dotnet/BenchmarkDotNet "The BenchmarkDotNet repository on GitHub" +[benchmarks-data]: https://github.com/martincostello/benchmarks "Benchmarks data repository on GitHub" +[benchmarks-demo]: https://github.com/martincostello/benchmarks-demo "Benchmarks demo repository on GitHub" +[benchmarks-site]: https://benchmarks.martincostello.com/ "Benchmarks dashboard deployed to GitHub Pages" +[blazor]: https://learn.microsoft.com/aspnet/core/blazor/ "ASP.NET Core Blazor" +[dotnet-aspire]: https://learn.microsoft.com/dotnet/aspire/ ".NET Aspire documentation" +[dotnet-performance]: https://github.com/dotnet/performance "The dotnet/performance repository on GitHub" +[openapi-post]: https://blog.martincostello.com/whats-new-for-openapi-with-dotnet-9/ "What's New for OpenAPI with .NET 9" +[plotly]: https://plotly.com/javascript/ "Plotly JavaScript Open Source Graphing Library" +[power-bi]: https://learn.microsoft.com/power-bi/fundamentals/power-bi-overview "What is Power BI?" +[publisher-action]: https://github.com/martincostello/benchmarkdotnet-results-publisher "The benchmarkdotnet-results-publisher repository on GitHub" +[publisher-inspiration]: https://github.com/benchmark-action/github-action-benchmark "The github-action-benchmark repository on GitHub" +[regression-comment]: https://github.com/martincostello/project-euler/pull/335#issuecomment-2302688319 "Example of a comment on a pull request for a regression" From dffa2da9f4c4099407dd7959a959f129b994fbf8 Mon Sep 17 00:00:00 2001 From: martin-costello Date: Sun, 22 Sep 2024 13:31:23 +0100 Subject: [PATCH 2/8] Initial draft Initial draft of the blog post. 
--- ...hats-new-for-openapi-with-dotnet-9.html.md | 1047 +++++++++-------- ...-continuous-benchmarks-on-a-budget.html.md | 37 - ...-continuous-benchmarks-on-a-budget.html.md | 146 +++ 3 files changed, 670 insertions(+), 560 deletions(-) delete mode 100644 source/2024-09-17-continuous-benchmarks-on-a-budget.html.md create mode 100644 source/2024-09-23-continuous-benchmarks-on-a-budget.html.md diff --git a/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md b/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md index 3d6901b..f55422d 100644 --- a/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md +++ b/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md @@ -1,523 +1,524 @@ ---- -title: "What's New for OpenAPI with .NET 9" -date: 2024-09-09 -tags: dotnet,openapi,swagger,swashbuckle -layout: bloglayout -description: "A look at the new Microsoft.AspNetCore.OpenApi package in .NET 9 and comparing it to NSwag and Swashbuckle.AspNetCore." -image: "https://cdn.martincostello.com/blog_openapi.png" ---- - -The OpenAPI logo - -Developers in the .NET ecosystem have been writing APIs with ASP.NET and ASP.NET Core for years, and -[OpenAPI][openapi] (nÊe Swagger) has been a popular choice for documenting those APIs. -OpenAPI at its core is a machine-readable document that describes the endpoints available in an API. -It contains information not only about parameters, requests and responses, but also additional metadata -such as descriptions of properties, security-related metadata, and more. - -These documents can then be consumed by tools such as [Swagger UI][swagger-ui] to provide a user interface -for developers to interact with the API quickly and easily, such as when testing. With the recent surge in -popularity of AI-based development tools, OpenAPI has become even more important as a way to describe APIs -in a way that machines can understand. - -For a long time, the two most common libraries to produce API specifications at runtime for ASP.NET Core -have been [NSwag][nswag] and [Swashbuckle][swashbuckle]. Both libraries provide functionality that allows -developers to generate a rich OpenAPI document(s) for their APIs in either JSON and/or YAML from their -existing code. The endpoints can then be augmented in different ways, such as with attributes or custom -code, to further enrich the generated document(s) to provide a great Developer Experience for its consumers. - -With the upcoming release of [ASP.NET Core 9][aspnetcore-9], the ASP.NET team have introduced new functionality -for the existing [Microsoft.AspNetCore.OpenApi NuGet package][microsoft-aspnetcore-openapi], that provides a -new way to generate OpenAPI documents for ASP.NET Core Minimal APIs. - -In this post, we'll take a look at the new functionality and compare it to the exsisting NSwag and Swashbuckle -libraries to see how it compares in both features as well as performance. - -READMORE - -## Why a new OpenAPI library? - -You may wonder why when there's two existing and popular solutions for generating OpenAPI documents in ASP.NET Core -that there's a need for a third new option to enter the fray. While both NSwag and Swashbuckle have served the community -well for many years, recently both libraries have seen a decline in maintenance and updates. This has led to a lag -in the ability for new features of the framework to be leveraged and/or supported in these libraries with each new release. 
- -While Swashbuckle has had a bit of a resurgence in 2024 with the [announcement of new maintainers for the project][swashbuckle-maintainers] -(I'm one of them 👋) and now has first-class support for .NET 8, it is still an open source project that is provided -for free and maintained by volunteers in their spare time. With these constraints, it's difficult to keep up with the -pace of change in the .NET ecosystem with a new major release every year. By contrast, the ASP.NET team at Microsoft -are paid to work on the framework full-time, so can dedicate time to ensure that the libraries they provide are kept -up-to-date with the latest features and best practices as the product evolves over time. - -Another motivating factor for the new library is that [native AoT compilation][native-aot] is becoming an increasingly -popular way to deploy .NET applications, especially in the cloud, where reducing cold start times is important for -high-scale applications with variable load patterns. Both NSwag and Swashbuckle rely heavily on reflection to generate -their OpenAPI documents, but reflection has many constraints when used in an application compiled to run as native code. -This makes many existing code patterns in these libraries not work due to the metadata needed being trimmed away, as it -appears to be unused. - -While both libraries probably _could_ be refactored to support native AoT, this would be a significant amount of work to -undertake as it would require a significant rewrite of the core functionality of both libraries. Speaking as a Swashbuckle -maintainer, the amount of work required is so large compared to the benefits it would provide, that it's not something that -is realistically going to happen. - -A brand new library that is designed from the ground-up to support native AoT compilation and the latest features of ASP.NET -Core however is a very different proposition. Any new library is unburdened by the weight of its existing functionality, -and instead can start fresh with a new design that is more suited to the current state of the ASP.NET Core ecosystem and -its needs in 2024 and beyond. - -The Swashbuckle maintainers are also unconcerned that there's a new library on the scene. We don't consider it to be a -competitor to Swashbuckle - for example, the new library only supports ASP.NET Core 9 and later, whereas Swashbuckle has -a broader range of support for older versions of ASP.NET Core, including for .NET Framework. Users who want to use the -new functionality and wish to migrate are welcome to do so, but we're not going to stop maintaining Swashbuckle any time -soon. I'm sure many developers are happy with their existing library of choice and will continue to use it rather than invest -time and effort moving from one library to another. - -## Microsoft.AspNetCore.OpenApi Features (and Gaps) - -At a high-level, the new Microsoft.AspNetCore.OpenApi package has the same basic functionality as both NSwag and Swashbuckle. -It generates an OpenAPI document for your ASP.NET Core endpoints at runtime. The shape of your endpoints, such as their methods, -paths, requests, responses, parameters etc. are all derived from your application's code. The declaration can be extended with -metadata, such as with attributes like `[ProducesResponseType]` and `[Tags]`, to provide additional information to the -generation process to describe the endpoints and schemas as required for your needs. 
- -The library also integrates with the existing [Microsoft.Extensions.ApiDescription.Server package][m-e-apidescription-server] -to generate the document at build time via a custom MSBuild target that can run as part of compiling your project to produce -the OpenAPI document as a file on disk. This is useful for CI/CD scenarios like [linting][linting] - for example you could run -[spectral][spectral] as part of your build pipeline to validate that your OpenAPI document is valid and follows recommended best practices. - -Like Swashbuckle, the package is built on top of the [OpenAPI.NET][microsoft-openapi] library, which provides the C# types -for the various primitives of the [OpenAPI specification][openapi-specification]. The advantage of this is that adding support -for new versions of the OpenAPI specification in the future (e.g. OpenAPI 3.1) should be easier as the library can be updated -to use a new version that supports it in the future, with only the "glue" for generating the types from the endpoints needing -to be updating, rather than also needing to fully implement the specification itself. The generation of the JSON schemas for -the models is built on top of the [new JSON schema support][json-schema-exporter] in .NET 9, which is exposed by the new -`JsonSchemaExporter` class. - -OpenAPI support is added at the [endpoint][aspnetcore-endpoints] level (think `MapGet()` and similar methods). This allow the -the OpenAPI document can be coupled into other mechanisms in ASP.NET Core, such as authorization, caching, and more. - -As noted earlier, it is also fully compatible with native AoT, allowing you to generate OpenAPI documents for your ASP.NET Core -applications at runtime even when compiled to native code, such as when running in a container, if you want to expose your -API documentation to your users in your deployed environment. - -To add the minimal level of support for generating an OpenAPI document, you could add the following code to your ASP.NET Core -application after adding a reference to the Microsoft.AspNetCore.OpenApi NuGet package: - -``` -var builder = WebApplication.CreateBuilder(); - -// Add services for generating OpenAPI documents -builder.Services.AddOpenApi(); - -var app = builder.Build(); - -// Add the endpoint to get the OpenAPI document -app.MapOpenApi(); - -// Your API endpoints -app.MapGet("/", () => "Hello world!"); - -app.Run(); -``` - -Running the server and navigating to the `/openapi/v1/openapi.json` URL in a browser will then return a OpenAPI document as JSON -that describes the endpoints in your application. - -### Transformers - -If (or when) you need to enrich the document further, the library provides a number of extensions points that you can use to -extend the document, or individual operations and/or schemas, using the concept of _transformers_. Transformers provide a way -for you to run custom code to modify the OpenAPI document as it is being generated, allowing you to add additional metadata. - -Transformers can either be registered as inline delegates or as types that implement the appropriate transformer interface -(`IOpenApiDocumentTransformer`, `IOpenApiOperationTransformer` or `IOpenApiSchemaTransformer`). In the case of the interfaces, -this allows you to implement types that use various additional services (e.g. `IConfiguration`) in your implementations and -means they can be resolved from the dependency injection container used by your application. 
- -Here's an example of declaring and then using a document transformer: - -``` -// Add a custom service to the DI container -builder.Services.AddTransient(); - -// Add services for generating OpenAPI documents and register a custom document transformer -builder.Services.AddOpenApi(options => -{ - options.AddDocumentTransformer(); -}); - -// A custom implementation of IOpenApiDocumentTransformer that uses our custom service. -// The type is activated from the DI container, so can use other services in the application. -class MyDocumentTransformer(IMyService myService) : IOpenApiDocumentTransformer -{ - public async Task TransformAsync( - OpenApiDocument document, - OpenApiDocumentTransformerContext context, - CancellationToken cancellationToken) - { - // Use myService to modify the document in some way... - } -} -``` - -As another example of the power of these transformers, I've built a library of my own on top of these abstractions to add -additional capabilities for my own APIs. The [OpenAPI Extensions for ASP.NET Core][openapi-extensions] library provides a -number of transformers that can be used to add additional metadata to the OpenAPI document, such as support for generating -rich examples for requests, responses and schemas. - -### Feature Gaps - -As a first release however, there are a few feature gaps compared to what developers may come to expect from an OpenAPI -solution compared to NSwag and Swashbuckle. - -#### No User Interface - -Compared to the application templates that shipped with the .NET SDK in previous releases of ASP.NET Core, there is no -built-in solution to render a user interface on top of the OpenAPI document that is generated. - -I don't think this is a major gap at this stage, as it's still possible to add a Swagger UI to your application with ease -by continuing to use the [Swashbuckle.AspNetCore.SwaggerUI][swashbuckle-ui] NuGet package to provide one. This NuGet package -is independent from the rest of Swashbuckle, so can be used with the new OpenAPI library without any issues or bloat from -including two implementations. From version 6.6.2 of Swashbuckle.AspNetCore, this package also supports native AoT, so -doesn't compromise support for that either. - -#### No XML Comments - -For the .NET 9 release, there is no support for adding descriptions to the OpenAPI document from the XML documentation in -your code. This is a feature that is [planned for a future release][aspnetcore-openapi-xml], likely .NET 10, but a preview -of the feature is expected to be made available at some point before then. - -If this is critical for your application, you could investigate creating your own transformer to consume your XML -documentation until then. - -#### No support for YAML documents - -While both the Microsoft.OpenApi library and NSwag support generating OpenAPI documents in YAML (unlike Swashbuckle), the -Microsoft.AspNetCore.OpenApi package currently only supports generating OpenAPI documents in JSON. This is a feature that -could be added in a future release. - -This is again another piece of functionality I've added to my [OpenAPI Extensions for ASP.NET Core][openapi-extensions] library, so you -could use that to generate YAML documents if you need to. It's enabled with a single line of code in your application: - -``` -app.MapOpenApiYaml(); -``` - -## Comparison with NSwag and Swashbuckle - -So how does the new Microsoft.AspNetCore.OpenApi package compare to the existing NSwag and Swashbuckle libraries? 
- -While the goal of the library is not for 100% feature parity with either of the existing libraries, it does provide the -majority of the same functionality that developers would expect from an OpenAPI library for ASP.NET Core applications. -As noted above, the core gaps are support for XML comments and a built-in User Interface. - -If you'd like a more detailed comparison of the three libraries, you can check out this [GitHub repository][openapi-comparisons] -that implements a Todo API and exposes equivalent OpenAPI documents for it using all three libraries. This should give you -a good idea of how all three libraries express the same concepts and how you use them as an application developer. - -As an example, here's the code to add an OpenAPI document and customise the API info in all three implementations. - -One thing you'll notice that the same ability to customise the document is done through either similar concepts that are -named either _transformers_ (ASP.NET Core), _processors_ (NSwag) or _filters_ (Swashbuckle). - -### Microsoft.AspNetCore.OpenApi - -``` -public static IServiceCollection AddAspNetCoreOpenApi(this IServiceCollection services) -{ - services.AddOpenApi(options => - { - options.AddDocumentTransformer((document, _, _) => - { - document.Info.Title = "Todo API"; - document.Info.Description = "An API for managing Todo items."; - document.Info.Version = "v1"; - - return Task.CompletedTask; - }); - - options.AddOperationTransformer(new AddExamplesTransformer()); - }); - - return services; -} -``` - -[Code][example-aspnetcore] - -### NSwag - -``` -public static IServiceCollection AddNSwagOpenApi(this IServiceCollection services) -{ - services.AddOpenApiDocument(options => - { - options.Title = "Todo API"; - options.Description = "An API for managing Todo items."; - options.Version = "v1"; - - options.OperationProcessors.Add(new AddExamplesProcessor()); - }); - - return services; -} -``` - -[Code][example-nswag] - -### Swashbuckle - -``` -public static IServiceCollection AddSwashbuckleOpenApi(this IServiceCollection services) -{ - services.AddSwaggerGen(options => - { - var info = new OpenApiInfo - { - Title = "Todo API", - Description = "An API for managing Todo items.", - Version = "v1" - }; - - options.SwaggerDoc(info.Version, info); - options.OperationFilter(); - }); - - return services; -} -``` - -[Code][example-swashbuckle] - -## Performance - -The last thing I thought I'd touch on in this blog post is performance. After I'd created the repository comparing the three -implementations, I figured it would be interesting to benchmark them to compare how they perform when generating an OpenAPI document. - -### Preliminary Results with .NET 9 Preview 7 - -After a detour off into setting up a continuous benchmarking process (which I'll try and blog about separately some time soon), -I set up some benchmarks for each library with [BenchmarkDotNet][benchmarkdotnet] to compare the performance. 
When I first set -them up I was targeting the official Preview 7 release of .NET 9, and at a very high-level, these were the results I got: - -``` -BenchmarkDotNet v0.14.0, Ubuntu 22.04.4 LTS (Jammy Jellyfish) -AMD EPYC 7763, 1 CPU, 4 logical and 2 physical cores -.NET SDK 9.0.100-preview.7.24406.8 - [Host] : .NET 9.0.0 (9.0.24.40507), X64 RyuJIT AVX2 - ShortRun : .NET 9.0.0 (9.0.24.40507), X64 RyuJIT AVX2 - -Job=ShortRun IterationCount=3 LaunchCount=1 -WarmupCount=3 - -| Method | Mean | Error | StdDev | Gen0 | Gen1 | Gen2 | Allocated | -|------------ |----------:|----------:|----------:|---------:|---------:|---------:|----------:| -| AspNetCore | 10.988 ms | 13.319 ms | 0.7301 ms | 171.8750 | 140.6250 | 125.0000 | 6.02 MB | -| NSwag | 12.269 ms | 2.276 ms | 0.1247 ms | 15.6250 | - | - | 1.55 MB | -| Swashbuckle | 7.989 ms | 6.878 ms | 0.3770 ms | 15.6250 | - | - | 1.5 MB | -``` - -[Commit][benchmark-commit-preview7] - -As you can see from the data, the new OpenAPI package is roughly second along with NSwag in terms of performance, with Swashbuckle -ahead by a few milliseconds. However the new ASP.NET Core OpenAPI is _way_ behind in terms of memory usage, using nearly 4 times as -much as the other two libraries. You can also see from the graphs below from many runs over time with the preview 7 that there is -a lot of variance in the OpenAPI package's performance, compared to the other two libraries which are much more stable. - - -
-[Graph: ASP.NET Core results for .NET 9 preview 7]
-[Graph: NSwag results for .NET 9 preview 7]
-[Graph: Swashbuckle results for .NET 9 preview 7]
- -Not particularly great, but there's actually two interesting caveats to these benchmarks. - -The first is that [there is a bug][dotnet-aspnetcore-56990] in ASP.NET Core 9 Preview 7 that caused the OpenAPI document schemas to -not be stable between generations - this was leading to a lot of unncessary work being done, and was causing a memory leak that -eventually caused OpenAPI generation to stop working completely. Because of this issue, I had to cap the number of iterations the -benchmarks ran as a short run via `[ShortRunJob]`, otherwise the benchmarks would grind to a halt. This is also the cause of the -variance in the allocation numbers (the red line at the top of the first graph). - -The second caveat is that, by default, NSwag caches the OpenAPI document it generates, so out-of-the-box it will only ever generate -the OpenAPI document once. For the sake of comparison, I [disabled the caching][disabled-caching] in NSwag so that the document was -generated in full on each request. We _could_ level the playing field in the opposite direction by caching all three, but that's not -interesting for a performance comparison/test as we'd effectively just be benchmarking the caching 😄. - -### Gotta Go Fast đŸĻ”đŸ’¨ - -With some data to hand, I then took a look into what exactly the code was doing to see if there was anything obvious that could be -fixed or improved to speed things up. What was invaluable in this process was the [EventPipe Profiler][benchmarkdotnet-profiler] -that can be enabled in BenchmarkDotNet to capture a flame graph of the code being executed. Using [speedscope.app][speedscope] I -was able to visualise the code paths that were being executed and see where the time was being spent. With this information, I was -able to identify three different places where the OpenAPI generation was doing unnecessary work and causing the performance issues. - -#### Dictionary Lookups đŸ•ĩī¸đŸ“– - -The first thing I found was that the code seemed to be spending a lot of time in the `Enumerable.All()` method. Digging into this -further, I noticed that `IDictionary.Contains()` was being used in a number of places in the code along with the indexer. -This is a known performance trap in .NET, with this pattern leading to a double look-up, which can be avoided by instead using the -`TryGetValue()` method. - -In fact there's even a .NET analysis rule that covers this scenario: [CA1854][ca1854]. It turns out -[there's a bug][dotnet-roslyn-analyzers-7369] in this analyser that doesn't catch certain patterns of usage, which is why it wasn't -caught previously. - -Changing the code to use `TryGetValue()` instead was an easy enough change to make, but that didn't answer the question of why so -much time was being spent in `All()` in the first place. The reason for this turned out to be due to the way the OpenAPI library -was implementing [`IEqualityComparer`][iequalitycomparer] for the various types used to generate the OpenAPI document. - -Some custom equality comparers are implemented which are used to help test whether different OpenAPI schema "shapes" are equal to -each other or not. These objects in some cases contain dozens of properties, some of which are themselves dictionaries or arrays, -which can create a large object graph to traverse to compute the equality of. - -With some reordering to how the properties are computed based on expense/likelihood of being different, a lot of the cost of these -comparisons can be avoided and make things much faster in the majority of cases. 
- -I opened [a pull request][dotnet-aspnetcore-57208] to address both of these items, which once merged caused all of the identified -method calls to drop out of the hot path for the profiler traces in the benchmarks đŸ”Ĩ. - -#### Too Many Transformers 🤖 - -After the fix for the unstable schemas and the above changes, I took another look at the traces from my benchmark runs and -spotted one other anomaly from the data. Looking at the data, I noticed that [transformers were being created too often][dotnet-aspnetcore-57211]. - -This was due to an issue with the lifetime and disposal of transformers, meaning that they were being created once per _schema_, -rather than once per generation of the OpenAPI _document_. This then had not only the overhead of the additional work, but also -an impact to memory usage and garbage collection. - -### Latest Results with .NET 9 RC.1 - -After changes for the above issues were merged, I re-ran the benchmarks against the latest daily build of .NET 9 from their CI, -as at the time of writing, .NET 9 RC.1 isn't officially available yet. I've [written about using daily builds before][daily-builds], so -check out that post if you're interested. - -With the latest version of the .NET SDK from the .NET 9 CI (`9.0.100-rc.1.24452.12` at the time of writing) things are noticeably -improved compared to preview 7: - -``` -BenchmarkDotNet v0.14.0, Ubuntu 22.04.4 LTS (Jammy Jellyfish) -AMD EPYC 7763, 1 CPU, 4 logical and 2 physical cores -.NET SDK 9.0.100-rc.1.24452.12 - [Host] : .NET 9.0.0 (9.0.24.43107), X64 RyuJIT AVX2 - DefaultJob : .NET 9.0.0 (9.0.24.43107), X64 RyuJIT AVX2 - -| Method | Mean | Error | StdDev | Median | Gen0 | Allocated | -|------------ |-----------:|---------:|----------:|-----------:|--------:|-----------:| -| AspNetCore | 981.9 us | 15.94 us | 30.34 us | 975.3 us | - | 326.64 KB | -| NSwag | 4,570.8 us | 60.82 us | 53.92 us | 4,556.4 us | 15.6250 | 1588.43 KB | -| Swashbuckle | 2,768.2 us | 52.00 us | 124.58 us | 2,721.2 us | 15.6250 | 1527 KB | -``` - -[Commit][benchmark-commit-rc1] - -As you can see compared to the previous results, the OpenAPI package is now the fastest of the three libraries. - -The new ASP.NET Core package beats both NSwag and Swashbuckle by a significant margin, both in terms of time _and_ memory. ⚡ - -In fact it's almost **~2.8x** faster, and **~4.6x** less memory hungry that the nearest competitor. - -Compared to itself from preview 7, it's now **~11x** faster and allocates **~18x** less memory. That's a huge improvement! 🚀 - -
-[Graph: ASP.NET Core results for .NET 9 preview 7]
-[Graph: NSwag results for .NET 9 preview 7]
-[Graph: Swashbuckle results for .NET 9 preview 7]
- -The caveats to note here: - -- `[ShortRunJob]` is no longer used, so the benchmarks run more iterations and are thus more accurate. This is why - the error bars are much smaller in the second series of graphs. -- _All_ improvements between .NET 9 preview 7 and release candidate 1 are included, not just the fixes for OpenAPI. - This is most apparent from the major step down on the graph for all three libraries a few points in from the left. - This is where the benchmark project switches from using preview 7 to the daily RC1 builds. - -As ever, performance is relative to the environment used and your numbers might vary. However with a relatively stable -environment (GitHub Actions' Ubuntu runners in this case), the graphs show consistent performance across multiple runs -and a clear improvement as newer versions of .NET 9 are used. The useful data here is in the trends, not the absolute values. - -## Further Reading - -For more information on the new features in the Microsoft.AspNetCore.OpenApi package, check out these ASP.NET Community Standup -streams on YouTube. Here [Safia Abdalla][safia-abdalla], the engineer behind this new functionality, explains the new features -in the package and how to use them in your applications: - -- [OpenAPI Updates in .NET 9][aspnetcore-openapi-stream-1] -- [OpenAPI Updates in .NET 9 (Part 2)][aspnetcore-openapi-stream-2] - -The documentation for the package for ASP.NET Core 9 can be found in [Microsoft Learn][aspnetcore-openapi]. - -## Summary - -All in all, the new ASP.NET Core OpenAPI package is a great addition to the ASP.NET Core ecosystem. It provides a modern and -performant way to generate OpenAPI documents for your ASP.NET Core applications to cover the core use cases that developers need. - -While it may not yet be as feature-rich as existing libraries such as NSwag or Swashbuckle, it's better ability to keep up with -the change of pace to ASP.NET Core now and in the future, such as support for native AoT, give it a strong foundation to build -on going forwards, such as for future support for OpenAPI 3.1. - -Developers don't need to switch from their existing libraries to the new OpenAPI package if they're happy with their current -implementation - the only compelling reason to switch is if you want to generate OpenAPI documents in a native AoT deployment. -For those who do wish to switch (I have for a number of my apps), the migration is easiest for users of Swashbuckle.AspNetCore due -to both libraries being built on top of the same OpenAPI.NET foundation. - -If you've not added OpenAPI documentation to an API before and are writing a new ASP.NET Core 9+ application, I'd recommend giving -the library a try to see how it fits your needs. It's a great way to get started with OpenAPI documentation for your APIs! 
- -[aspnetcore-9]: https://learn.microsoft.com/aspnet/core/release-notes/aspnetcore-9.0 "What's new in ASP.NET Core 9.0" -[aspnetcore-endpoints]: https://learn.microsoft.com/aspnet/core/fundamentals/routing "Routing in ASP.NET Core" -[aspnetcore-openapi]: https://learn.microsoft.com/aspnet/core/fundamentals/openapi/aspnetcore-openapi?view=aspnetcore-9.0 "Work with OpenAPI documents on Microsoft Learn" -[aspnetcore-openapi-stream-1]: https://www.youtube.com/watch/XoMese9g8WQ "ASP.NET Community Standup - OpenAPI Updates in .NET 9 on YouTube" -[aspnetcore-openapi-stream-2]: https://www.youtube.com/watch/keK69Y5HqvY "ASP.NET Community Standup - OpenAPI Updates in .NET 9 (Part 2) on YouTube" -[aspnetcore-openapi-xml]: https://github.com/dotnet/aspnetcore/issues/39927#issuecomment-2233634912 "Support XML-based OpenAPI docs for minimal APIs" -[benchmarkdotnet]: https://github.com/dotnet/BenchmarkDotNet "The BenchmarkDotNet repository on GitHub" -[benchmarkdotnet-profiler]: https://benchmarkdotnet.org/articles/features/event-pipe-profiler.html "EventPipeProfiler" -[benchmark-commit-preview7]: https://github.com/martincostello/aspnetcore-openapi/commit/fd5d79a12deeeda3abc10b61a80f2568bd38b381 -[benchmark-commit-rc1]: https://github.com/martincostello/aspnetcore-openapi/commit/6a09d0422eeeabe38cc4ea7655af04d5d7209d11 -[ca1854]: https://learn.microsoft.com/dotnet/fundamentals/code-analysis/quality-rules/CA1854 "Prefer the IDictionary.TryGetValue(TKey, out TValue) method" -[daily-builds]: https://blog.martincostello.com/upgrading-to-dotnet-8-part-5-preview-7-and-rc-1-2/ "Daily Build Testing" -[disabled-caching]: https://github.com/martincostello/aspnetcore-openapi/blob/fd5d79a12deeeda3abc10b61a80f2568bd38b381/src/TodoApp/OpenApi/NSwag/NSwagOpenApiEndpoints.cs#L94-L97 -[dotnet-aspnetcore-56990]: https://github.com/dotnet/aspnetcore/issues/56990 "OpenAPI schemas are not stable between generations" -[dotnet-aspnetcore-57208]: https://github.com/dotnet/aspnetcore/pull/57208 "Use TryGetValue for dictionary lookups in OpenAPI comparers" -[dotnet-aspnetcore-57211]: https://github.com/dotnet/aspnetcore/pull/57211 "OpenAPI activates transformers too many times" -[dotnet-roslyn-analyzers-7369]: https://github.com/dotnet/roslyn-analyzers/issues/7369 "CA1854 isn't catching cases that aren't directly part of an if statement" -[example-aspnetcore]: https://github.com/martincostello/aspnetcore-openapi/blob/06b3aff0e5023cce8a5c8599507b4d974aedf37b/src/TodoApp/OpenApi/AspNetCore/AspNetCoreOpenApiEndpoints.cs -[example-nswag]: https://github.com/martincostello/aspnetcore-openapi/blob/06b3aff0e5023cce8a5c8599507b4d974aedf37b/src/TodoApp/OpenApi/NSwag/NSwagOpenApiEndpoints.cs -[example-swashbuckle]: https://github.com/martincostello/aspnetcore-openapi/blob/06b3aff0e5023cce8a5c8599507b4d974aedf37b/src/TodoApp/OpenApi/Swashbuckle/SwashbuckleOpenApiEndpoints.cs -[iequalitycomparer]: https://learn.microsoft.com/dotnet/api/system.collections.generic.iequalitycomparer-1 "IEqualityComparer Interface" -[json-schema-exporter]: https://github.com/dotnet/core/blob/main/release-notes/9.0/preview/preview6/libraries.md#jsonschemaexporter "JsonSchemaExporter" -[linting]: https://learn.microsoft.com/aspnet/core/fundamentals/openapi/aspnetcore-openapi?view=aspnetcore-9.0#lint-generated-openapi-documents-with-spectral "Lint generated OpenAPI documents with Spectral" -[microsoft-aspnetcore-openapi]: https://www.nuget.org/packages/Microsoft.AspNetCore.OpenApi "The Microsoft.AspNetCore.OpenApi package on NuGet.org" 
-[m-e-apidescription-server]: https://www.nuget.org/packages/Microsoft.Extensions.ApiDescription.Server/ "The Microsoft.Extensions.ApiDescription.Server package on NuGet.org" -[microsoft-openapi]: https://github.com/microsoft/OpenAPI.NET "The OpenAPI.NET repository on GitHub" -[native-aot]: https://learn.microsoft.com/dotnet/core/deploying/native-aot "Native AOT deployment" -[nswag]: https://github.com/RicoSuter/NSwag "The NSwag repository on GitHub" -[openapi]: https://swagger.io/docs/specification/about/ "What Is OpenAPI?" -[openapi-comparisons]: https://github.com/martincostello/aspnetcore-openapi "A GitHub repository comparing OpenAPI implementations for ASP.NET Core" -[openapi-extensions]: https://github.com/martincostello/openapi-extensions "The OpenAPI Extensions repository on GitHub" -[openapi-specification]: https://swagger.io/specification/ "The OpenAPI specification" -[safia-abdalla]: https://github.com/captainsafia "@captainsafia on GitHub" -[spectral]: https://github.com/stoplightio/spectral "The Spectral repository on GitHub" -[speedscope]: https://www.speedscope.app/ "speedscope" -[swagger-ui]: https://github.com/swagger-api/swagger-ui "The Swagger UI repository on GitHub" -[swashbuckle]: https://github.com/domaindrivendev/Swashbuckle.AspNetCore "The Swashbuckle.AspNetCore repository on GitHub" -[swashbuckle-maintainers]: https://github.com/domaindrivendev/Swashbuckle.AspNetCore/discussions/2778 "Swashbuckle.AspNetCore maintainers announcement" -[swashbuckle-ui]: https://www.nuget.org/packages/Swashbuckle.AspNetCore.SwaggerUI "The Swashbuckle.AspNetCore.SwaggerUI package on NuGet.org" +--- +title: "What's New for OpenAPI with .NET 9" +date: 2024-09-09 +tags: dotnet,openapi,swagger,swashbuckle +layout: bloglayout +description: "A look at the new Microsoft.AspNetCore.OpenApi package in .NET 9 and comparing it to NSwag and Swashbuckle.AspNetCore." +image: "https://cdn.martincostello.com/blog_openapi.png" +--- + +The OpenAPI logo + +Developers in the .NET ecosystem have been writing APIs with ASP.NET and ASP.NET Core for years, and +[OpenAPI][openapi] (nÊe Swagger) has been a popular choice for documenting those APIs. +OpenAPI at its core is a machine-readable document that describes the endpoints available in an API. +It contains information not only about parameters, requests and responses, but also additional metadata +such as descriptions of properties, security-related metadata, and more. + +These documents can then be consumed by tools such as [Swagger UI][swagger-ui] to provide a user interface +for developers to interact with the API quickly and easily, such as when testing. With the recent surge in +popularity of AI-based development tools, OpenAPI has become even more important as a way to describe APIs +in a way that machines can understand. + +For a long time, the two most common libraries to produce API specifications at runtime for ASP.NET Core +have been [NSwag][nswag] and [Swashbuckle][swashbuckle]. Both libraries provide functionality that allows +developers to generate a rich OpenAPI document(s) for their APIs in either JSON and/or YAML from their +existing code. The endpoints can then be augmented in different ways, such as with attributes or custom +code, to further enrich the generated document(s) to provide a great Developer Experience for its consumers. 
+ +With the upcoming release of [ASP.NET Core 9][aspnetcore-9], the ASP.NET team have introduced new functionality +for the existing [Microsoft.AspNetCore.OpenApi NuGet package][microsoft-aspnetcore-openapi], which provides a +new way to generate OpenAPI documents for ASP.NET Core Minimal APIs. + +In this post, we'll take a look at the new functionality and compare it to the existing NSwag and Swashbuckle +libraries in terms of both features and performance. + +READMORE + +## Why a new OpenAPI library? + +You may wonder why, when there are already two popular solutions for generating OpenAPI documents in ASP.NET Core, +there's a need for a third option to enter the fray. While both NSwag and Swashbuckle have served the community +well for many years, recently both libraries have seen a decline in maintenance and updates. This has led to a lag +in the ability for new features of the framework to be leveraged and/or supported in these libraries with each new release. + +While Swashbuckle has had a bit of a resurgence in 2024 with the [announcement of new maintainers for the project][swashbuckle-maintainers] +(I'm one of them 👋) and now has first-class support for .NET 8, it is still an open source project that is provided +for free and maintained by volunteers in their spare time. With these constraints, it's difficult to keep up with the +pace of change in the .NET ecosystem with a new major release every year. By contrast, the ASP.NET team at Microsoft +are paid to work on the framework full-time, so can dedicate time to ensure that the libraries they provide are kept +up-to-date with the latest features and best practices as the product evolves over time. + +Another motivating factor for the new library is that [native AoT compilation][native-aot] is becoming an increasingly +popular way to deploy .NET applications, especially in the cloud, where reducing cold start times is important for +high-scale applications with variable load patterns. Both NSwag and Swashbuckle rely heavily on reflection to generate +their OpenAPI documents, but reflection has many constraints when used in an application compiled to run as native code. +This means many existing code patterns in these libraries do not work, because the metadata they need is trimmed away as it +appears to be unused. + +While both libraries probably _could_ be refactored to support native AoT, this would be a significant amount of work to +undertake, as it would require a substantial rewrite of the core functionality of both libraries. Speaking as a Swashbuckle +maintainer, the amount of work required is so large compared to the benefits it would provide that it's not something that +is realistically going to happen. + +A brand new library that is designed from the ground up to support native AoT compilation and the latest features of ASP.NET +Core, however, is a very different proposition. Any new library is unburdened by the weight of its existing functionality, +and instead can start fresh with a new design that is more suited to the current state of the ASP.NET Core ecosystem and +its needs in 2024 and beyond. + +The Swashbuckle maintainers are also unconcerned that there's a new library on the scene. We don't consider it to be a +competitor to Swashbuckle - for example, the new library only supports ASP.NET Core 9 and later, whereas Swashbuckle has +a broader range of support for older versions of ASP.NET Core, including for .NET Framework.
Users who want to use the +new functionality and wish to migrate are welcome to do so, but we're not going to stop maintaining Swashbuckle any time +soon. I'm sure many developers are happy with their existing library of choice and will continue to use it rather than invest +time and effort moving from one library to another. + +## Microsoft.AspNetCore.OpenApi Features (and Gaps) + +At a high level, the new Microsoft.AspNetCore.OpenApi package has the same basic functionality as both NSwag and Swashbuckle. +It generates an OpenAPI document for your ASP.NET Core endpoints at runtime. The shape of your endpoints, such as their methods, +paths, requests, responses and parameters, is all derived from your application's code. The declaration can be extended with +metadata, such as with attributes like `[ProducesResponseType]` and `[Tags]`, to provide additional information to the +generation process to describe the endpoints and schemas as required for your needs. + +The library also integrates with the existing [Microsoft.Extensions.ApiDescription.Server package][m-e-apidescription-server] +to generate the document at build time via a custom MSBuild target that can run as part of compiling your project to produce +the OpenAPI document as a file on disk. This is useful for CI/CD scenarios like [linting][linting] - for example you could run +[spectral][spectral] as part of your build pipeline to check that your OpenAPI document is valid and follows recommended best practices. + +Like Swashbuckle, the package is built on top of the [OpenAPI.NET][microsoft-openapi] library, which provides the C# types +for the various primitives of the [OpenAPI specification][openapi-specification]. The advantage of this is that adding support +for new versions of the OpenAPI specification (e.g. OpenAPI 3.1) should be easier, as the library can be updated +to use a version that supports it, with only the "glue" for generating the types from the endpoints needing +to be updated, rather than also needing to fully implement the specification itself. The generation of the JSON schemas for +the models is built on top of the [new JSON schema support][json-schema-exporter] in .NET 9, which is exposed by the new +`JsonSchemaExporter` class. + +OpenAPI support is added at the [endpoint][aspnetcore-endpoints] level (think `MapGet()` and similar methods). This allows the +OpenAPI document to be coupled into other mechanisms in ASP.NET Core, such as authorization, caching, and more. + +As noted earlier, it is also fully compatible with native AoT, allowing you to generate OpenAPI documents for your ASP.NET Core +applications at runtime even when compiled to native code, such as when running in a container, if you want to expose your +API documentation to your users in your deployed environment.
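As a quick illustration of that endpoint-level coupling - this is a sketch of my own rather than code from the post - the endpoint registered by `MapOpenApi()` (the same call used in the minimal setup below) behaves like any other ASP.NET Core endpoint, so it can opt into features such as output caching (or `RequireAuthorization()`) in the usual way:

```
var builder = WebApplication.CreateBuilder();

// Output caching is only here to show the OpenAPI endpoint composing with
// another ASP.NET Core feature; it is not required for document generation.
builder.Services.AddOutputCache();
builder.Services.AddOpenApi();

var app = builder.Build();

app.UseOutputCache();

// Cache the generated document like any other endpoint's output
app.MapOpenApi().CacheOutput();

app.Run();
```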
+ +To add the minimal level of support for generating an OpenAPI document, you could add the following code to your ASP.NET Core +application after adding a reference to the Microsoft.AspNetCore.OpenApi NuGet package: + +``` +var builder = WebApplication.CreateBuilder(); + +// Add services for generating OpenAPI documents +builder.Services.AddOpenApi(); + +var app = builder.Build(); + +// Add the endpoint to get the OpenAPI document +app.MapOpenApi(); + +// Your API endpoints +app.MapGet("/", () => "Hello world!"); + +app.Run(); +``` + +Running the server and navigating to the `/openapi/v1/openapi.json` URL in a browser will then return an OpenAPI document as JSON +that describes the endpoints in your application. + +### Transformers + +If (or when) you need to enrich the document further, the library provides a number of extension points that you can use to +extend the document, or individual operations and/or schemas, using the concept of _transformers_. Transformers provide a way +for you to run custom code to modify the OpenAPI document as it is being generated, allowing you to add additional metadata. + +Transformers can either be registered as inline delegates or as types that implement the appropriate transformer interface +(`IOpenApiDocumentTransformer`, `IOpenApiOperationTransformer` or `IOpenApiSchemaTransformer`). In the case of the interfaces, +this allows you to implement types that use various additional services (e.g. `IConfiguration`) in your implementations and +means they can be resolved from the dependency injection container used by your application. + +Here's an example of declaring and then using a document transformer: + +``` +// Add a custom service to the DI container +builder.Services.AddTransient<IMyService, MyService>(); + +// Add services for generating OpenAPI documents and register a custom document transformer +builder.Services.AddOpenApi(options => +{ + options.AddDocumentTransformer<MyDocumentTransformer>(); +}); + +// A custom implementation of IOpenApiDocumentTransformer that uses our custom service. +// The type is activated from the DI container, so can use other services in the application. +class MyDocumentTransformer(IMyService myService) : IOpenApiDocumentTransformer +{ + public async Task TransformAsync( + OpenApiDocument document, + OpenApiDocumentTransformerContext context, + CancellationToken cancellationToken) + { + // Use myService to modify the document in some way... + } +} +``` + +As another example of the power of these transformers, I've built a library of my own on top of these abstractions to add +additional capabilities for my own APIs. The [OpenAPI Extensions for ASP.NET Core][openapi-extensions] library provides a +number of transformers that can be used to add additional metadata to the OpenAPI document, such as support for generating +rich examples for requests, responses and schemas. + +### Feature Gaps + +As a first release, however, there are a few feature gaps compared to what developers may have come to expect from an OpenAPI +solution such as NSwag or Swashbuckle. + +#### No User Interface + +Compared to the application templates that shipped with the .NET SDK in previous releases of ASP.NET Core, there is no +built-in solution to render a user interface on top of the OpenAPI document that is generated. + +I don't think this is a major gap at this stage, as it's still possible to add a Swagger UI to your application with ease +by continuing to use the [Swashbuckle.AspNetCore.SwaggerUI][swashbuckle-ui] NuGet package to provide one.
This NuGet package +is independent from the rest of Swashbuckle, so it can be used with the new OpenAPI library without any issues or bloat from +including two implementations. From version 6.6.2 of Swashbuckle.AspNetCore, this package also supports native AoT, so it +doesn't compromise support for that either. + +#### No XML Comments + +For the .NET 9 release, there is no support for adding descriptions to the OpenAPI document from the XML documentation in +your code. This is a feature that is [planned for a future release][aspnetcore-openapi-xml], likely .NET 10, but a preview +of the feature is expected to be made available at some point before then. + +If this is critical for your application, you could investigate creating your own transformer to consume your XML +documentation until then. + +#### No support for YAML documents + +While both the Microsoft.OpenApi library and NSwag support generating OpenAPI documents in YAML (unlike Swashbuckle), the +Microsoft.AspNetCore.OpenApi package currently only supports generating OpenAPI documents in JSON. This is a feature that +could be added in a future release. + +This is another piece of functionality I've added to my [OpenAPI Extensions for ASP.NET Core][openapi-extensions] library, so you +could use that to generate YAML documents if you need to. It's enabled with a single line of code in your application: + +``` +app.MapOpenApiYaml(); +``` + +## Comparison with NSwag and Swashbuckle + +So how does the new Microsoft.AspNetCore.OpenApi package compare to the existing NSwag and Swashbuckle libraries? + +While the goal of the library is not 100% feature parity with either of the existing libraries, it does provide the +majority of the same functionality that developers would expect from an OpenAPI library for ASP.NET Core applications. +As noted above, the core gaps are support for XML comments and a built-in User Interface. + +If you'd like a more detailed comparison of the three libraries, you can check out this [GitHub repository][openapi-comparisons] +that implements a Todo API and exposes equivalent OpenAPI documents for it using all three libraries. This should give you +a good idea of how all three libraries express the same concepts and how you use them as an application developer. + +As an example, here's the code to add an OpenAPI document and customise the API info in all three implementations. + +One thing you'll notice is that the ability to customise the document is exposed through similar concepts that are +named either _transformers_ (ASP.NET Core), _processors_ (NSwag) or _filters_ (Swashbuckle).
+ +### Microsoft.AspNetCore.OpenApi + +``` +public static IServiceCollection AddAspNetCoreOpenApi(this IServiceCollection services) +{ + services.AddOpenApi(options => + { + options.AddDocumentTransformer((document, _, _) => + { + document.Info.Title = "Todo API"; + document.Info.Description = "An API for managing Todo items."; + document.Info.Version = "v1"; + + return Task.CompletedTask; + }); + + options.AddOperationTransformer(new AddExamplesTransformer()); + }); + + return services; +} +``` + +[Code][example-aspnetcore] + +### NSwag + +``` +public static IServiceCollection AddNSwagOpenApi(this IServiceCollection services) +{ + services.AddOpenApiDocument(options => + { + options.Title = "Todo API"; + options.Description = "An API for managing Todo items."; + options.Version = "v1"; + + options.OperationProcessors.Add(new AddExamplesProcessor()); + }); + + return services; +} +``` + +[Code][example-nswag] + +### Swashbuckle + +``` +public static IServiceCollection AddSwashbuckleOpenApi(this IServiceCollection services) +{ + services.AddSwaggerGen(options => + { + var info = new OpenApiInfo + { + Title = "Todo API", + Description = "An API for managing Todo items.", + Version = "v1" + }; + + options.SwaggerDoc(info.Version, info); + options.OperationFilter(); + }); + + return services; +} +``` + +[Code][example-swashbuckle] + +## Performance + +The last thing I thought I'd touch on in this blog post is performance. After I'd created the repository comparing the three +implementations, I figured it would be interesting to benchmark them to compare how they perform when generating an OpenAPI document. + +### Preliminary Results with .NET 9 Preview 7 + +After a detour off into setting up a continuous benchmarking process ([read about it here][continuous-benchmarks]), +I set up some benchmarks for each library with [BenchmarkDotNet][benchmarkdotnet] to compare the performance. When I first set +them up I was targeting the official Preview 7 release of .NET 9, and at a very high-level, these were the results I got: + +``` +BenchmarkDotNet v0.14.0, Ubuntu 22.04.4 LTS (Jammy Jellyfish) +AMD EPYC 7763, 1 CPU, 4 logical and 2 physical cores +.NET SDK 9.0.100-preview.7.24406.8 + [Host] : .NET 9.0.0 (9.0.24.40507), X64 RyuJIT AVX2 + ShortRun : .NET 9.0.0 (9.0.24.40507), X64 RyuJIT AVX2 + +Job=ShortRun IterationCount=3 LaunchCount=1 +WarmupCount=3 + +| Method | Mean | Error | StdDev | Gen0 | Gen1 | Gen2 | Allocated | +|------------ |----------:|----------:|----------:|---------:|---------:|---------:|----------:| +| AspNetCore | 10.988 ms | 13.319 ms | 0.7301 ms | 171.8750 | 140.6250 | 125.0000 | 6.02 MB | +| NSwag | 12.269 ms | 2.276 ms | 0.1247 ms | 15.6250 | - | - | 1.55 MB | +| Swashbuckle | 7.989 ms | 6.878 ms | 0.3770 ms | 15.6250 | - | - | 1.5 MB | +``` + +[Commit][benchmark-commit-preview7] + +As you can see from the data, the new OpenAPI package is roughly second along with NSwag in terms of performance, with Swashbuckle +ahead by a few milliseconds. However the new ASP.NET Core OpenAPI is _way_ behind in terms of memory usage, using nearly 4 times as +much as the other two libraries. You can also see from the graphs below from many runs over time with the preview 7 that there is +a lot of variance in the OpenAPI package's performance, compared to the other two libraries which are much more stable. + + +
+[Graph: ASP.NET Core results for .NET 9 preview 7]
+[Graph: NSwag results for .NET 9 preview 7]
+[Graph: Swashbuckle results for .NET 9 preview 7]
+ +Not particularly great, but there are actually two interesting caveats to these benchmarks. + +The first is that [there is a bug][dotnet-aspnetcore-56990] in ASP.NET Core 9 Preview 7 that caused the OpenAPI document schemas to +not be stable between generations - this was leading to a lot of unnecessary work being done, and was causing a memory leak that +eventually caused OpenAPI generation to stop working completely. Because of this issue, I had to cap the number of iterations the +benchmarks ran as a short run via `[ShortRunJob]`, otherwise the benchmarks would grind to a halt. This is also the cause of the +variance in the allocation numbers (the red line at the top of the first graph). + +The second caveat is that, by default, NSwag caches the OpenAPI document it generates, so out-of-the-box it will only ever generate +the OpenAPI document once. For the sake of comparison, I [disabled the caching][disabled-caching] in NSwag so that the document was +generated in full on each request. We _could_ level the playing field in the opposite direction by caching all three, but that's not +interesting for a performance comparison/test as we'd effectively just be benchmarking the caching 😄. + +### Gotta Go Fast đŸĻ”đŸ’¨ + +With some data to hand, I then took a look into what exactly the code was doing to see if there was anything obvious that could be +fixed or improved to speed things up. What was invaluable in this process was the [EventPipe Profiler][benchmarkdotnet-profiler] +that can be enabled in BenchmarkDotNet to capture a flame graph of the code being executed. Using [speedscope.app][speedscope] I +was able to visualise the code paths that were being executed and see where the time was being spent. With this information, I was +able to identify three different places where the OpenAPI generation was doing unnecessary work and causing the performance issues. + +#### Dictionary Lookups đŸ•ĩī¸đŸ“– + +The first thing I found was that the code seemed to be spending a lot of time in the `Enumerable.All()` method. Digging into this +further, I noticed that `IDictionary.Contains()` was being used in a number of places in the code along with the indexer. +This is a known performance trap in .NET, with this pattern leading to a double look-up, which can be avoided by instead using the +`TryGetValue()` method. + +In fact there's even a .NET analysis rule that covers this scenario: [CA1854][ca1854]. It turns out +[there's a bug][dotnet-roslyn-analyzers-7369] in this analyser that doesn't catch certain patterns of usage, which is why it wasn't +caught previously. + +Changing the code to use `TryGetValue()` instead was an easy enough change to make, but that didn't answer the question of why so +much time was being spent in `All()` in the first place. The reason for this turned out to be the way the OpenAPI library +was implementing [`IEqualityComparer<T>`][iequalitycomparer] for the various types used to generate the OpenAPI document. + +Some custom equality comparers are implemented which are used to help test whether different OpenAPI schema "shapes" are equal to +each other or not. These objects in some cases contain dozens of properties, some of which are themselves dictionaries or arrays, +which can create a large object graph to traverse to compute equality. + +By reordering how the properties are compared, based on their expense and the likelihood of them being different, a lot of the cost of these +comparisons can be avoided, making things much faster in the majority of cases.
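To make the double look-up pattern concrete, here's a generic before-and-after sketch of my own (not the actual code from the OpenAPI comparers):

```
var schemas = new Dictionary<string, int>();

// Before: ContainsKey followed by the indexer hashes and looks up the key twice (CA1854)
if (schemas.ContainsKey("todo"))
{
    Console.WriteLine(schemas["todo"]);
}

// After: TryGetValue performs a single look-up
if (schemas.TryGetValue("todo", out var schemaCount))
{
    Console.WriteLine(schemaCount);
}
```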
+
+I opened [a pull request][dotnet-aspnetcore-57208] to address both of these items, which once merged caused all of the identified
+method calls to drop out of the hot path for the profiler traces in the benchmarks đŸ”Ĩ.
+
+#### Too Many Transformers 🤖
+
+After the fix for the unstable schemas and the above changes, I took another look at the traces from my benchmark runs and
+spotted one other anomaly. Looking at the data, I noticed that [transformers were being created too often][dotnet-aspnetcore-57211].
+
+This was due to an issue with the lifetime and disposal of transformers, meaning that they were being created once per _schema_,
+rather than once per generation of the OpenAPI _document_. This then had not only the overhead of the additional work, but also
+an impact on memory usage and garbage collection.
+
+### Latest Results with .NET 9 RC.1
+
+After changes for the above issues were merged, I re-ran the benchmarks against the latest daily build of .NET 9 from their CI,
+as at the time of writing, .NET 9 RC.1 isn't officially available yet. I've [written about using daily builds before][daily-builds], so
+check out that post if you're interested.
+
+With the latest version of the .NET SDK from the .NET 9 CI (`9.0.100-rc.1.24452.12` at the time of writing) things are noticeably
+improved compared to preview 7:
+
+```
+BenchmarkDotNet v0.14.0, Ubuntu 22.04.4 LTS (Jammy Jellyfish)
+AMD EPYC 7763, 1 CPU, 4 logical and 2 physical cores
+.NET SDK 9.0.100-rc.1.24452.12
+  [Host]     : .NET 9.0.0 (9.0.24.43107), X64 RyuJIT AVX2
+  DefaultJob : .NET 9.0.0 (9.0.24.43107), X64 RyuJIT AVX2
+
+| Method      | Mean       | Error    | StdDev    | Median     | Gen0    | Allocated  |
+|------------ |-----------:|---------:|----------:|-----------:|--------:|-----------:|
+| AspNetCore  |   981.9 us | 15.94 us |  30.34 us |   975.3 us |       - |  326.64 KB |
+| NSwag       | 4,570.8 us | 60.82 us |  53.92 us | 4,556.4 us | 15.6250 | 1588.43 KB |
+| Swashbuckle | 2,768.2 us | 52.00 us | 124.58 us | 2,721.2 us | 15.6250 |    1527 KB |
+```
+
+[Commit][benchmark-commit-rc1]
+
+As you can see compared to the previous results, the OpenAPI package is now the fastest of the three libraries.
+
+The new ASP.NET Core package beats both NSwag and Swashbuckle by a significant margin, both in terms of time _and_ memory. ⚡
+
+In fact it's almost **~2.8x** faster and **~4.6x** less memory-hungry than the nearest competitor.
+
+Compared to itself from preview 7, it's now **~11x** faster and allocates **~18x** less memory. That's a huge improvement! 🚀
+
+
+
+
+ ASP.NET Core results for .NET 9 preview 7 +
+
+ NSwag results for .NET 9 preview 7 +
+
+ Swashbuckle results for .NET 9 preview 7 +
+
+
+
+The caveats to note here:
+
+- `[ShortRunJob]` is no longer used, so the benchmarks run more iterations and are thus more accurate. This is why
+  the error bars are much smaller in the second series of graphs.
+- _All_ improvements between .NET 9 preview 7 and release candidate 1 are included, not just the fixes for OpenAPI.
+  This is most apparent from the major step down on the graph for all three libraries a few points in from the left.
+  This is where the benchmark project switches from using preview 7 to the daily RC1 builds.
+
+As ever, performance is relative to the environment used and your numbers might vary. However, with a relatively stable
+environment (GitHub Actions' Ubuntu runners in this case), the graphs show consistent performance across multiple runs
+and a clear improvement as newer versions of .NET 9 are used. The useful data here is in the trends, not the absolute values.
+
+## Further Reading
+
+For more information on the new features in the Microsoft.AspNetCore.OpenApi package, check out these ASP.NET Community Standup
+streams on YouTube. Here [Safia Abdalla][safia-abdalla], the engineer behind this new functionality, explains the new features
+in the package and how to use them in your applications:
+
+- [OpenAPI Updates in .NET 9][aspnetcore-openapi-stream-1]
+- [OpenAPI Updates in .NET 9 (Part 2)][aspnetcore-openapi-stream-2]
+
+The documentation for the package for ASP.NET Core 9 can be found on [Microsoft Learn][aspnetcore-openapi].
+
+## Summary
+
+All in all, the new ASP.NET Core OpenAPI package is a great addition to the ASP.NET Core ecosystem. It provides a modern and
+performant way to generate OpenAPI documents for your ASP.NET Core applications that covers the core use cases developers need.
+
+While it may not yet be as feature-rich as existing libraries such as NSwag or Swashbuckle, its better ability to keep up with
+the pace of change of ASP.NET Core now and in the future, such as support for native AoT, gives it a strong foundation to build
+on going forwards, for example for future support of OpenAPI 3.1.
+
+Developers don't need to switch from their existing libraries to the new OpenAPI package if they're happy with their current
+implementation - the only compelling reason to switch is if you want to generate OpenAPI documents in a native AoT deployment.
+For those who do wish to switch (I have for a number of my apps), the migration is easiest for users of Swashbuckle.AspNetCore due
+to both libraries being built on top of the same OpenAPI.NET foundation (a minimal before-and-after sketch is shown below).
+
+If you've not added OpenAPI documentation to an API before and are writing a new ASP.NET Core 9+ application, I'd recommend giving
+the library a try to see how it fits your needs. It's a great way to get started with OpenAPI documentation for your APIs!
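+
+To illustrate that migration, here's a minimal before-and-after sketch for a minimal API application. It's a simplified example
+rather than a drop-in recipe - any document customisations you've configured for Swashbuckle still need to be mapped across by hand.
+
+Before, with Swashbuckle.AspNetCore:
+
+```csharp
+var builder = WebApplication.CreateBuilder(args);
+
+builder.Services.AddEndpointsApiExplorer();
+builder.Services.AddSwaggerGen();
+
+var app = builder.Build();
+
+// Serves the document at /swagger/v1/swagger.json by default
+app.UseSwagger();
+
+app.MapGet("/hello", () => new { Message = "Hello world!" });
+
+app.Run();
+```
+
+After, with Microsoft.AspNetCore.OpenApi in .NET 9:
+
+```csharp
+var builder = WebApplication.CreateBuilder(args);
+
+builder.Services.AddOpenApi();
+
+var app = builder.Build();
+
+// Serves the document at /openapi/v1.json by default
+app.MapOpenApi();
+
+app.MapGet("/hello", () => new { Message = "Hello world!" });
+
+app.Run();
+```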
+ +[aspnetcore-9]: https://learn.microsoft.com/aspnet/core/release-notes/aspnetcore-9.0 "What's new in ASP.NET Core 9.0" +[aspnetcore-endpoints]: https://learn.microsoft.com/aspnet/core/fundamentals/routing "Routing in ASP.NET Core" +[aspnetcore-openapi]: https://learn.microsoft.com/aspnet/core/fundamentals/openapi/aspnetcore-openapi?view=aspnetcore-9.0 "Work with OpenAPI documents on Microsoft Learn" +[aspnetcore-openapi-stream-1]: https://www.youtube.com/watch/XoMese9g8WQ "ASP.NET Community Standup - OpenAPI Updates in .NET 9 on YouTube" +[aspnetcore-openapi-stream-2]: https://www.youtube.com/watch/keK69Y5HqvY "ASP.NET Community Standup - OpenAPI Updates in .NET 9 (Part 2) on YouTube" +[aspnetcore-openapi-xml]: https://github.com/dotnet/aspnetcore/issues/39927#issuecomment-2233634912 "Support XML-based OpenAPI docs for minimal APIs" +[benchmarkdotnet]: https://github.com/dotnet/BenchmarkDotNet "The BenchmarkDotNet repository on GitHub" +[benchmarkdotnet-profiler]: https://benchmarkdotnet.org/articles/features/event-pipe-profiler.html "EventPipeProfiler" +[benchmark-commit-preview7]: https://github.com/martincostello/aspnetcore-openapi/commit/fd5d79a12deeeda3abc10b61a80f2568bd38b381 +[benchmark-commit-rc1]: https://github.com/martincostello/aspnetcore-openapi/commit/6a09d0422eeeabe38cc4ea7655af04d5d7209d11 +[ca1854]: https://learn.microsoft.com/dotnet/fundamentals/code-analysis/quality-rules/CA1854 "Prefer the IDictionary.TryGetValue(TKey, out TValue) method" +[continuous-benchmarks]: https://blog.martincostello.com/continuous-benchmarks-on-a-budget/ +[daily-builds]: https://blog.martincostello.com/upgrading-to-dotnet-8-part-5-preview-7-and-rc-1-2/ "Daily Build Testing" +[disabled-caching]: https://github.com/martincostello/aspnetcore-openapi/blob/fd5d79a12deeeda3abc10b61a80f2568bd38b381/src/TodoApp/OpenApi/NSwag/NSwagOpenApiEndpoints.cs#L94-L97 +[dotnet-aspnetcore-56990]: https://github.com/dotnet/aspnetcore/issues/56990 "OpenAPI schemas are not stable between generations" +[dotnet-aspnetcore-57208]: https://github.com/dotnet/aspnetcore/pull/57208 "Use TryGetValue for dictionary lookups in OpenAPI comparers" +[dotnet-aspnetcore-57211]: https://github.com/dotnet/aspnetcore/pull/57211 "OpenAPI activates transformers too many times" +[dotnet-roslyn-analyzers-7369]: https://github.com/dotnet/roslyn-analyzers/issues/7369 "CA1854 isn't catching cases that aren't directly part of an if statement" +[example-aspnetcore]: https://github.com/martincostello/aspnetcore-openapi/blob/06b3aff0e5023cce8a5c8599507b4d974aedf37b/src/TodoApp/OpenApi/AspNetCore/AspNetCoreOpenApiEndpoints.cs +[example-nswag]: https://github.com/martincostello/aspnetcore-openapi/blob/06b3aff0e5023cce8a5c8599507b4d974aedf37b/src/TodoApp/OpenApi/NSwag/NSwagOpenApiEndpoints.cs +[example-swashbuckle]: https://github.com/martincostello/aspnetcore-openapi/blob/06b3aff0e5023cce8a5c8599507b4d974aedf37b/src/TodoApp/OpenApi/Swashbuckle/SwashbuckleOpenApiEndpoints.cs +[iequalitycomparer]: https://learn.microsoft.com/dotnet/api/system.collections.generic.iequalitycomparer-1 "IEqualityComparer Interface" +[json-schema-exporter]: https://github.com/dotnet/core/blob/main/release-notes/9.0/preview/preview6/libraries.md#jsonschemaexporter "JsonSchemaExporter" +[linting]: https://learn.microsoft.com/aspnet/core/fundamentals/openapi/aspnetcore-openapi?view=aspnetcore-9.0#lint-generated-openapi-documents-with-spectral "Lint generated OpenAPI documents with Spectral" +[microsoft-aspnetcore-openapi]: 
https://www.nuget.org/packages/Microsoft.AspNetCore.OpenApi "The Microsoft.AspNetCore.OpenApi package on NuGet.org" +[m-e-apidescription-server]: https://www.nuget.org/packages/Microsoft.Extensions.ApiDescription.Server/ "The Microsoft.Extensions.ApiDescription.Server package on NuGet.org" +[microsoft-openapi]: https://github.com/microsoft/OpenAPI.NET "The OpenAPI.NET repository on GitHub" +[native-aot]: https://learn.microsoft.com/dotnet/core/deploying/native-aot "Native AOT deployment" +[nswag]: https://github.com/RicoSuter/NSwag "The NSwag repository on GitHub" +[openapi]: https://swagger.io/docs/specification/about/ "What Is OpenAPI?" +[openapi-comparisons]: https://github.com/martincostello/aspnetcore-openapi "A GitHub repository comparing OpenAPI implementations for ASP.NET Core" +[openapi-extensions]: https://github.com/martincostello/openapi-extensions "The OpenAPI Extensions repository on GitHub" +[openapi-specification]: https://swagger.io/specification/ "The OpenAPI specification" +[safia-abdalla]: https://github.com/captainsafia "@captainsafia on GitHub" +[spectral]: https://github.com/stoplightio/spectral "The Spectral repository on GitHub" +[speedscope]: https://www.speedscope.app/ "speedscope" +[swagger-ui]: https://github.com/swagger-api/swagger-ui "The Swagger UI repository on GitHub" +[swashbuckle]: https://github.com/domaindrivendev/Swashbuckle.AspNetCore "The Swashbuckle.AspNetCore repository on GitHub" +[swashbuckle-maintainers]: https://github.com/domaindrivendev/Swashbuckle.AspNetCore/discussions/2778 "Swashbuckle.AspNetCore maintainers announcement" +[swashbuckle-ui]: https://www.nuget.org/packages/Swashbuckle.AspNetCore.SwaggerUI "The Swashbuckle.AspNetCore.SwaggerUI package on NuGet.org" diff --git a/source/2024-09-17-continuous-benchmarks-on-a-budget.html.md b/source/2024-09-17-continuous-benchmarks-on-a-budget.html.md deleted file mode 100644 index 10928bd..0000000 --- a/source/2024-09-17-continuous-benchmarks-on-a-budget.html.md +++ /dev/null @@ -1,37 +0,0 @@ ---- -title: "Continuous Benchmarks on a Budget" -date: 2024-09-17 -tags: actions,benchmarks,benchmarkdotnet,ci,dotnet,github -layout: bloglayout -description: "Using GitHub Actions, GitHub Pages and Blazor to run and visualise continuous benchmarks with BenchmarkDotNet with zero hosting costs." 
-image: "https://cdn.martincostello.com/blog_TODO.png" ---- - -TODO - -TODO - -READMORE - -TODO - -## Summary - -TODO - -[aspnetcore-benchmarks-dashboard]: https://aka.ms/aspnet/benchmarks "ASP.NET Core Benchmarks Power BI Dashboard" -[aspnetcore-benchmarks-code]: https://github.com/aspnet/Benchmarks "ASP.NET Core Benchmarks on GitHub" -[aspnetcore-benchmarks-docs]: https://github.com/dotnet/aspnetcore/blob/main/docs/Benchmarks.md "ASP.NET Core Benchmarks on GitHub" -[benchmarkdotnet]: https://github.com/dotnet/BenchmarkDotNet "The BenchmarkDotNet repository on GitHub" -[benchmarks-data]: https://github.com/martincostello/benchmarks "Benchmarks data repository on GitHub" -[benchmarks-demo]: https://github.com/martincostello/benchmarks-demo "Benchmarks demo repository on GitHub" -[benchmarks-site]: https://benchmarks.martincostello.com/ "Benchmarks dashboard deployed to GitHub Pages" -[blazor]: https://learn.microsoft.com/aspnet/core/blazor/ "ASP.NET Core Blazor" -[dotnet-aspire]: https://learn.microsoft.com/dotnet/aspire/ ".NET Aspire documentation" -[dotnet-performance]: https://github.com/dotnet/performance "The dotnet/performance repository on GitHub" -[openapi-post]: https://blog.martincostello.com/whats-new-for-openapi-with-dotnet-9/ "What's New for OpenAPI with .NET 9" -[plotly]: https://plotly.com/javascript/ "Plotly JavaScript Open Source Graphing Library" -[power-bi]: https://learn.microsoft.com/power-bi/fundamentals/power-bi-overview "What is Power BI?" -[publisher-action]: https://github.com/martincostello/benchmarkdotnet-results-publisher "The benchmarkdotnet-results-publisher repository on GitHub" -[publisher-inspiration]: https://github.com/benchmark-action/github-action-benchmark "The github-action-benchmark repository on GitHub" -[regression-comment]: https://github.com/martincostello/project-euler/pull/335#issuecomment-2302688319 "Example of a comment on a pull request for a regression" diff --git a/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md new file mode 100644 index 0000000..ea79ec9 --- /dev/null +++ b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md @@ -0,0 +1,146 @@ +--- +title: "Continuous Benchmarks on a Budget" +date: 2024-09-23 +tags: actions,benchmarks,benchmarkdotnet,ci,dotnet,github +layout: bloglayout +description: "Using GitHub Actions, GitHub Pages and Blazor to run and visualise continuous benchmarks with BenchmarkDotNet with zero hosting costs." +image: "https://cdn.martincostello.com/blog_benchmarks-regression.png" +--- + +A chart showing a time series for performance and memory usage with an increase in memory usage in the most recent data points + +Over the last few months I've been doing a bunch of testing with the [new OpenAPI support in .NET 9][openapi-post]. +As part of that testing, I wanted to take a look at how the performance of the new libraries compared to the existing +open source libraries for OpenAPI support in .NET, the most popular including [NSwag][nswag] and [Swashbuckle.AspNetCore][swashbuckle]. + +It's fairly easy to get up and running writing some benchmarks using [BenchmarkDotNet][benchmarkdotnet], but it's often a task +that you need to sit down and do manually when you have the need, and then gets forgotten about as time goes on. 
Because of that, +I thought it would be a fun mini-project to set up some automation to run the benchmarks on a continuous basis so that I could +monitor the performance of my open source projects easily going forwards. + +In this post I'll cover how I went about setting up a continuous benchmarking pipeline using GitHub Actions, [GitHub Pages][github-pages] +and [Blazor][blazor] to run and visualise the results of the benchmarks on a "good enough" basis without needing to spend any money on infrastructure. + +READMORE + +## The Ideal + +In an ideal world, we'd all have access to a dedicated performance lab, with a number of dedicated high-specification physical +machines that we could use to run benchmarks on a regular basis. We could generate reams of data from these benchmarks, and then +ingest that data into a data warehousing solution and run reports, generate dashboards and much more to monitor performance metrics +for the software we're building. + +The .NET team is an engineering team with the budget for such a setup, and they have engineers dedicated to performance testing +and the supporting infrastructure needed to run them. For example they have a [Power BI dashboard][aspnetcore-benchmarks-dashboard] +that they use to track the performance of the ASP.NET Core framework over time using dozens of benchmarks for [ASP.NET Core][aspnetcore-benchmarks-code] +and the [.NET Runtime][dotnet-performance] that the product engineers can use to test the impact of changes they make, and are run +on a regular basis to identify regressions. You can read more about their [benchmarks][aspnetcore-benchmarks-docs] in GitHub. + +As great as that would be, we're not all the likes of Microsoft, especially in the open source world. I don't know about you, but +I certainly don't have the budget to maintain a dedicated performance lab, physical or virtual, and a data warehouse to run on top +of it. How can we as open source software developers leverage the free tools available to us today to achieve something similar in +spirit to an Enterprise-level solution that still gives us value? + +## Prior Art and Inspiration + +I'm a big fan of GitHub Actions, and use it for all of my own software projects to build and deploy my software, as well to automate +other tasks like [applying monthly .NET SDK updates][dotnet-automated-patching] or housekeeping tasks like [clearing out old Azure Container Registry images][acr-housekeeping]. +GitHub Actions also comes with a generous free tier for public repositories - at the time of writing you get unlimited minutes for +running GitHub Actions workflows, capped at [20 concurrent jobs][github-actions-limits] for Linux and Windows runners (macOS is less +generous, at 5). + +GitHub Actions isn't ever going to be a like-for-like replacement for dedicated performance machines, especially on the free tier +rather than with custom dedicated runners, but it's a great alternative. We can't rely on these runners to give us accurate _absolute_ +benchmark results (i.e. how fast can my code possibly ever go), but we can use them to give us good _relative_ benchmark results to +produce trends over time. There will still be an element of noise in the results due to the shared nature of the runners because +we have no control over the underlying hardware they run on, so they may change unexpectedly over time as the service is upgraded, +but that's a trade-off that can be balanced against the usefulness of such a architecture for a _"budget performance lab"_. 
+
+Given that, my first thought was that someone must have already written a GitHub Action to run benchmarks and collect the data for
+them. Indeed, that was the case and the action I found that ended up being a major source of inspiration for my own setup was the
+[benchmark-action/github-action-benchmark][publisher-inspiration] action.
+
+The action supports 10 existing performance testing tools, including BenchmarkDotNet for .NET, other tools for Go, Java and Python,
+as well as custom tools. The action ingests the output of these tools, summarises the results into a JSON document, and then
+pushes the results into a GitHub repository. It also commits static assets like HTML, CSS and JavaScript files to the repository
+alongside the results so that you can view the results in a web browser. The static pages include charts generated using [Chart.js][chart-js]
+so that you can view trends in the data over time and spot regressions. The action can also be configured to comment on pull requests
+or commits if it determines that a regression has occurred in the benchmark data, removing some of the burden of needing to watch
+for changes by eye.
+
+By setting up a [GitHub Pages][github-pages] site to serve a website for the content of the repository, you can use the static HTML
+files to visualise the results of the benchmarks in a web browser. GitHub Pages is free to use, so using a public GitHub repository (free)
+to store the data in conjunction with GitHub Pages to view the results (free) and GitHub Actions to run BenchmarkDotNet to generate the
+results (free), you can see how we've got all the pieces in place to host a continuous benchmarking solution without needing a budget for
+any hardware or infrastructure.
+
+## The Solution
+
+OK, so if there's already an action to do all of this, why did I go and write my own version of it? While the existing action is great,
+because it's focused on multiple different tools, there's an element of _least common denominator_ to the features it has. The key feature
+that it lacked for BenchmarkDotNet was the ability to visualise memory allocations in the charts in addition to the time/duration for
+the benchmarks. There were also a number of other minor things I wanted to be able to do that the existing action didn't support out-of-the-box,
+like customising Git commit details.
+
+While the UI it provides by default is functional, and it's possible to create your own custom UI to visualise the data, the JavaScript to
+generate the dashboards hasn't really been designed with testability and extensibility in mind (in my opinion). As I started to customise
+the provided code over a week or so to meet my needs, I found I was often breaking it with unintentional regressions, and it was difficult
+to test in the form it's provided in by default.
+
+With that in mind, I decided I would create my own fork-in-spirit of the original action, but with a focus on BenchmarkDotNet. This would
+allow me to customise the UI to my needs, and to make it more testable and extensible in the future. Also, a new side-project is always a
+fun excuse to learn some new technology!
+ +### Generating the Benchmarks + +TODO + +### Storing the Data + +TODO + +### Visualising the Data + +TODO + +## Concrete Results + +TODO + +## Summary + +TODO + +[acr-housekeeping]: https://github.com/martincostello/github-automation/blob/adf8d8b14b6b8ac7be8ca8f30614ac4dfb137642/.github/workflows/acr-housekeeping.yml "A GitHub Actions workflow to clean up ACR images" +[aspnetcore-benchmarks-dashboard]: https://aka.ms/aspnet/benchmarks "ASP.NET Core Benchmarks Power BI Dashboard" +[aspnetcore-benchmarks-code]: https://github.com/aspnet/Benchmarks "ASP.NET Core Benchmarks on GitHub" +[aspnetcore-benchmarks-docs]: https://github.com/dotnet/aspnetcore/blob/main/docs/Benchmarks.md "ASP.NET Core Benchmarks on GitHub" +[benchmarkdotnet]: https://github.com/dotnet/BenchmarkDotNet "The BenchmarkDotNet repository on GitHub" +[benchmarks-data]: https://github.com/martincostello/benchmarks "Benchmarks data repository on GitHub" +[benchmarks-demo]: https://github.com/martincostello/benchmarks-demo "Benchmarks demo repository on GitHub" +[benchmarks-site]: https://benchmarks.martincostello.com/ "Benchmarks dashboard deployed to GitHub Pages" +[blazor]: https://learn.microsoft.com/aspnet/core/blazor/ "ASP.NET Core Blazor" +[bunit]: https://bunit.dev/ "bUnit: a testing library for Blazor components" +[chart-js]: https://www.chartjs.org "Chart.js Website" +[dotnet-aspire]: https://learn.microsoft.com/dotnet/aspire/ ".NET Aspire documentation" +[dotnet-automated-patching]: https://www.youtube.com/live/pOeT1otTi4M?si=9OEq-rm_DTopVNd1&t=172 "On .NET Live - Effortless .NET updates with GitHub Actions" +[dotnet-performance]: https://github.com/dotnet/performance "The dotnet/performance repository on GitHub" +[github-actions-limits]: https://docs.github.com/en/actions/administering-github-actions/usage-limits-billing-and-administration "GitHub Actions usage limits, billing and administration" +[github-pages]: https://pages.github.com/ "GitHub Pages" +[nswag]: https://github.com/RicoSuter/NSwag "The NSwag repository on GitHub" +[openapi-post]: https://blog.martincostello.com/whats-new-for-openapi-with-dotnet-9/ "What's New for OpenAPI with .NET 9" +[plotly]: https://plotly.com/javascript/ "Plotly JavaScript Open Source Graphing Library" +[power-bi]: https://learn.microsoft.com/power-bi/fundamentals/power-bi-overview "What is Power BI?" +[publisher-action]: https://github.com/martincostello/benchmarkdotnet-results-publisher "The benchmarkdotnet-results-publisher repository on GitHub" +[publisher-inspiration]: https://github.com/benchmark-action/github-action-benchmark "The github-action-benchmark repository on GitHub" +[regression-comment]: https://github.com/martincostello/project-euler/pull/335#issuecomment-2302688319 "Example of a comment on a pull request for a regression" +[runtime-regression]: https://github.com/dotnet/runtime/issues/107869 "Performance regression with JsonObject creation by +70%" +[swashbuckle]: https://github.com/domaindrivendev/Swashbuckle.AspNetCore "The Swashbuckle.AspNetCore repository on GitHub" + + From 7d02c9594f87af243c37588f2dea7eb0f5ee1d6e Mon Sep 17 00:00:00 2001 From: martin-costello Date: Sun, 22 Sep 2024 15:58:26 +0100 Subject: [PATCH 3/8] Add more content Add more sections of the post. 
--- ...-continuous-benchmarks-on-a-budget.html.md | 164 ++++++++++++++++-- 1 file changed, 145 insertions(+), 19 deletions(-) diff --git a/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md index ea79ec9..4bfe74c 100644 --- a/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md +++ b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md @@ -22,10 +22,13 @@ I thought it would be a fun mini-project to set up some automation to run the be monitor the performance of my open source projects easily going forwards. In this post I'll cover how I went about setting up a continuous benchmarking pipeline using GitHub Actions, [GitHub Pages][github-pages] -and [Blazor][blazor] to run and visualise the results of the benchmarks on a "good enough" basis without needing to spend any money on infrastructure. +and [Blazor][blazor] to run and visualise the results of the benchmarks on a "good enough" basis without needing to spend any money__*__ +on infrastructure. READMORE +_*Unless you want to use this with GitHub Enterprise Server or non-public repositories. More information about this later._ + ## The Ideal In an ideal world, we'd all have access to a dedicated performance lab, with a number of dedicated high-specification physical @@ -34,10 +37,11 @@ ingest that data into a data warehousing solution and run reports, generate dash for the software we're building. The .NET team is an engineering team with the budget for such a setup, and they have engineers dedicated to performance testing -and the supporting infrastructure needed to run them. For example they have a [Power BI dashboard][aspnetcore-benchmarks-dashboard] -that they use to track the performance of the ASP.NET Core framework over time using dozens of benchmarks for [ASP.NET Core][aspnetcore-benchmarks-code] -and the [.NET Runtime][dotnet-performance] that the product engineers can use to test the impact of changes they make, and are run -on a regular basis to identify regressions. You can read more about their [benchmarks][aspnetcore-benchmarks-docs] in GitHub. +and the supporting infrastructure needed to run them. For example they have a [dashboard][aspnetcore-benchmarks-dashboard] using +[Power BI][power-bi] that they use to track the performance of the ASP.NET Core framework over time using dozens of benchmarks for +[ASP.NET Core][aspnetcore-benchmarks-code] and the [.NET Runtime][dotnet-performance] that the product engineers can use to test +the impact of changes they make, and are run on a regular basis to identify regressions. You can read more about their +[benchmarks][aspnetcore-benchmarks-docs] in GitHub. As great as that would be, we're not all the likes of Microsoft, especially in the open source world. I don't know about you, but I certainly don't have the budget to maintain a dedicated performance lab, physical or virtual, and a data warehouse to run on top @@ -46,18 +50,18 @@ spirit to an Enterprise-level solution that still gives us value? ## Prior Art and Inspiration -I'm a big fan of GitHub Actions, and use it for all of my own software projects to build and deploy my software, as well to automate -other tasks like [applying monthly .NET SDK updates][dotnet-automated-patching] or housekeeping tasks like [clearing out old Azure Container Registry images][acr-housekeeping]. 
-GitHub Actions also comes with a generous free tier for public repositories - at the time of writing you get unlimited minutes for -running GitHub Actions workflows, capped at [20 concurrent jobs][github-actions-limits] for Linux and Windows runners (macOS is less -generous, at 5). +I'm a big fan of [GitHub Actions][github-actions], and use it for all of my own software projects to build and deploy my software, +as well to automate other tasks like [applying monthly .NET SDK updates][dotnet-automated-patching] or housekeeping tasks like +[clearing out old Azure Container Registry images][acr-housekeeping]. GitHub Actions also comes with a generous free tier for public +repositories - at the time of writing you get unlimited minutes for running GitHub Actions workflows, capped at +[20 concurrent jobs][github-actions-limits] for Linux and Windows runners (macOS is less generous, at 5). GitHub Actions isn't ever going to be a like-for-like replacement for dedicated performance machines, especially on the free tier rather than with custom dedicated runners, but it's a great alternative. We can't rely on these runners to give us accurate _absolute_ benchmark results (i.e. how fast can my code possibly ever go), but we can use them to give us good _relative_ benchmark results to produce trends over time. There will still be an element of noise in the results due to the shared nature of the runners because we have no control over the underlying hardware they run on, so they may change unexpectedly over time as the service is upgraded, -but that's a trade-off that can be balanced against the usefulness of such a architecture for a _"budget performance lab"_. +but that's a trade-off that can be balanced against the usefulness of such an architecture for a _"budget performance lab"_. Given that, my first thought was that someone must have already written a GitHub Action to run benchmarks and collect the data for them. Indeed, that was the case and the action I found that ended up being a major source of inspiration for my own setup was the @@ -72,10 +76,10 @@ or commits if it determines that a regression has occurred in the benchmark data for changes by eye. By setting up a [GitHub Pages][github-pages] site to serve a website for the content of the repository, you can use the static HTML -files to visualise the results of the benchmarks in a web browser. GitHub Pages is free to use, so using a public GitHub repository (free) +files to visualise the results of the benchmarks in a browser. GitHub Pages is free to use, so using a public GitHub repository (free) to store the data in conjunction with GitHub Pages to view the results (free) and GitHub Actions to run BenchmarkDotNet to generate the results (free), you can see how we've got all the pieces in place to host a continuous benchmarking solution without needing a budget for -any hardware or infrastructure. +any hardware, infrastructure or hosting. ## The Solution @@ -94,17 +98,129 @@ With that in mind, I decided I would create my own fork-in-spirit of the origina allow me to customise the UI to my needs, and to make it more testable and extensible in the future. Also, a new side-project is always a fun excuse to learn some new technology! +### Storing the Data + +The first part of the solution, the data storage, is the easiest part. For this, all I needed to do was create a new public GitHub repository +([martincostello/benchmarks][benchmarks-data], how imaginative). 
For the design, the repository uses its branches to represent branches in
+the source repository, with the data for each specific repository stored in a directory named after the repository. The data is then stored
+in JSON files checked into the repository, providing a history of the benchmark data over time that can be tracked using standard Git tools.
+
+Using a dedicated repository for the data has a number of benefits:
+
+- All the data is stored in one repository, making it easy to access and query (and potentially migrate in the future if I need to change the data format);
+- The benchmark results do not affect the Git history of the source repository;
+- The dashboard can be deployed and versioned independently of the data - otherwise there'd be a lot of churn in the repository as the data is pushed.
+
+The main trade-off here, compared to storing data in the source repository, is that each repository generating benchmark results needs to
+have a GitHub access token configured that has write access to the data repository. This is just a minor inconvenience in terms of needing
+to add it to the necessary repositories, rather than a security concern. There's nothing stored in the data repository other than the data
+and GitHub files (README etc.).
+
+### Generating the Benchmarks
+
+For the second part of the solution, I created a new custom GitHub Action based on the existing action: [martincostello/benchmarkdotnet-results-publisher][publisher-action]
+
+The action is written in TypeScript, so it runs as a native JavaScript action in GitHub Actions workflows, rather than needing any
+additional software to be installed on a GitHub Actions runner.
+
+Some of the improvements I made for my version of the action include:
+
+- Adding data points for memory allocations, when present, to the Benchmark.NET results' JSON;
+- Support for customising the Git commit message and author details;
+- Pushing the data to the repository using the GitHub API to avoid the need to clone the data repository;
+- Pushing the data as valid JSON, rather than as a JavaScript object literal assigned to a `window` global variable.
+
+With the action published, the next step is to use it to generate the benchmark results from the source repositories.
+
+I won't get into the specifics of writing the actual benchmarks using BenchmarkDotNet, but the key part is a GitHub Actions workflow
+([example][benchmarks-workflow]) that runs the benchmarks using a GitHub-hosted Linux runner. At the time of writing, these use Ubuntu
+22.04 with x64 processors and have 1 CPU with 4 logical cores. The workflow then uses the action once the benchmarks have been run
+to publish the results to the [benchmarks repository][benchmarks-data]. The workflow runs for all pushes to a number of branches in the
+repository, as well as being able to be run on-demand if needed.
+
+I've chosen not to run the benchmarks on every pull request for a few reasons:
+
+- Pull requests from forks and from Dependabot do not have access to secrets - this means the data cannot be pushed to the other repository;
+- I don't want to tie up a lot of my GitHub Actions capacity running the benchmarks for all PRs given that most pull requests are unlikely
+  to change the performance characteristics of the code;
+- If a regression is detected post-merge, I can easily investigate the cause after-the-fact and either fix-forward or revert the change.
+
+The only requirement over basic BenchmarkDotNet usage is that the benchmarks need to be run with the `--exporters json` option to
+generate the benchmark results in JSON format. This is what the action uses to generate the summarised data for the dashboard.
+
+### Visualising the Data
+
+The final piece of the puzzle is [the dashboard to visualise the data][benchmarks-site]. I've been looking for a good excuse to try
+writing something using [Blazor][blazor] for a while, but I've never had a good reason to do so that would have otherwise needed a
+re-architecture of an existing web application of mine. This seemed like a great opportunity to give it a try and learn something new.
+
+As the dashboard is hosted in a GitHub Pages site, there's no back-end to the application, so a Blazor WebAssembly (WASM) application
+is the only avenue open to developing a Blazor application in this context.
+
+I wouldn't consider myself a web developer (centering `div`s is always hard, somehow), but I found Blazor to basically be _"React with C#"_,
+so given my comfort with C# and .NET development it was relatively easy to pick up once I got my head around a few new concepts
+(the render cycle, etc.). The difference between the original HTML with embedded JavaScript and my new Blazor version is night and day.
+
+I was also able to use [.NET Aspire][dotnet-aspire] as a good source of inspiration and practices for writing Blazor applications as the
+Aspire Dashboard is itself a Blazor application (albeit not Blazor WASM). It was also the source of inspiration I used for moving from
+Chart.js to [Plotly][plotly] for the charts in the dashboard so that I could add error bars to the data points from the benchmarks.
+
+It was also an opportunity to look into [bUnit][bunit] for testing the dashboard. I won't go on a tangent about bUnit, other than to say
+I was really impressed with how it plugged into the existing .NET test ecosystem I'm familiar with, using [xunit][xunit]. It was really easy
+for me to add unit tests for the components and pages and get good coverage of the codebase ([80%+][code-coverage]) with existing tools like
+[coverlet][coverlet] and [ReportGenerator][report-generator] to publish to [codecov.io][codecov].
+
+I was able to significantly extend the original kernel of the dashboard idea from the github-action-benchmark action to include a number
+of additional features that I wanted to be able to use. These included:
+
+- Viewing all data from a single HTML page, no matter the repository or branch;
+- Being able to load the data using the GitHub API to support storing the data in a different repository;
+- Support for GitHub Enterprise Server or internal/private repositories (we can build and deploy copies of the dashboard onto my
+  employer's GitHub Enterprise Server instance for teams to use internally);
+- GitHub authentication using [device flow][github-device-flow] to increase the rate limits for the GitHub API and support the above;
+- Deep-linking to specific repositories/branches/charts;
+- Downloading the charts as images for use elsewhere (like in blog posts or GitHub issues).
+
+You can find the source code for the dashboard in the [martincostello/benchmarks-dashboard][benchmarks-dashboard] repository.
+If you'd like to host your own version, you can either fork it and modify it to your needs and deploy from there, or you could use the +repository via a [Git submodule][git-submodule] in your own repository to host the dashboard in a subdirectory of your repository and +then customise the build process and change the configuration etc. before you deploy it. The submodule approach is what I've used to +deploy an orange-themed version of the dashboard for use in GitHub Enterprise Server at my employer for some internal repositories. + +#### The No-Cost Exception + +The device flow support is the one exception to the "no cost" rule for the solution. As a client-side application with no back-end, the +normal GitHub OAuth flow cannot be used to authenticate a user to obtain an access token for the GitHub API as it would expose the client +secret to the browser. The [device flow][github-device-flow] is a way to authenticate the user without needing a secret, but it does not +support CORS, so it's not possible to use it directly from a browser. To work around this, I added an endpoint to an existing API of mine +to proxy the device flow requests to GitHub with CORS support and then return the access token to the client. + +This doesn't cost me anything _extra_ as I already had a running piece of unrelated infrastructure that I could use for this purpose. If you +wanted to run this solution yourself with GitHub Enterprise Server, or private repositories, you would similarly need to deploy (or extend) +some infrastructure to proxy the device flow. + +Similarly, I added a custom domain to the GitHub Pages site, but this was again a cost I already had for my domain and DNS, so wasn't an +_additional_ cost. It's still possible to use the default GitHub Pages domain to host the site, you just don't get the custom/vanity URL +to serve it over. + +### The End Result + +With all the pieces in place, at a high-level the solution looks something like this: + +A sequence diagram showing how the application, data and dashboard repositories interact to render charts + +Which for the end-user (i.e. me) gives a nice interactive dashboard to visualise the results like this: + +A screenshot of the dashboard website showing two charts of time and memory consumption for a branch of a GitHub repository + +I've set up a demo repository ([martincostello/benchmarks-demo][benchmarks-demo]) that you can use as an inspiration for setting +up some Benchmark.NET benchmarks and then using a GitHub Actions workflow to run them and publish them to another repository. 
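+
+If you've not used BenchmarkDotNet before, a minimal benchmark project that fits this setup looks something like the sketch below.
+This is an illustrative example rather than code taken from the demo repository:
+
+```csharp
+using System.Security.Cryptography;
+using System.Text;
+using BenchmarkDotNet.Attributes;
+using BenchmarkDotNet.Running;
+
+// Pass the command-line arguments through so that options like `--exporters json` are honoured.
+BenchmarkRunner.Run<HashBenchmarks>(args: args);
+
+[MemoryDiagnoser] // Capture the allocation data that the dashboard also charts
+public class HashBenchmarks
+{
+    private readonly byte[] _payload = Encoding.UTF8.GetBytes(new string('a', 1024));
+
+    [Benchmark]
+    public byte[] Sha256() => SHA256.HashData(_payload);
+}
+```
+
+Running it with something like `dotnet run --configuration Release -- --exporters json` then produces the JSON results that the
+publisher action reads and pushes to the data repository.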
## Concrete Results @@ -119,16 +235,24 @@ TODO [aspnetcore-benchmarks-code]: https://github.com/aspnet/Benchmarks "ASP.NET Core Benchmarks on GitHub" [aspnetcore-benchmarks-docs]: https://github.com/dotnet/aspnetcore/blob/main/docs/Benchmarks.md "ASP.NET Core Benchmarks on GitHub" [benchmarkdotnet]: https://github.com/dotnet/BenchmarkDotNet "The BenchmarkDotNet repository on GitHub" +[benchmarks-dashboard]: https://github.com/martincostello/benchmarks-dashboard "Benchmarks dashboard repository on GitHub" [benchmarks-data]: https://github.com/martincostello/benchmarks "Benchmarks data repository on GitHub" [benchmarks-demo]: https://github.com/martincostello/benchmarks-demo "Benchmarks demo repository on GitHub" [benchmarks-site]: https://benchmarks.martincostello.com/ "Benchmarks dashboard deployed to GitHub Pages" +[benchmarks-workflow]: https://github.com/martincostello/api/blob/main/.github/workflows/benchmark-ci.yml "GitHub Actions workflow to run the benchmarks" [blazor]: https://learn.microsoft.com/aspnet/core/blazor/ "ASP.NET Core Blazor" [bunit]: https://bunit.dev/ "bUnit: a testing library for Blazor components" -[chart-js]: https://www.chartjs.org "Chart.js Website" -[dotnet-aspire]: https://learn.microsoft.com/dotnet/aspire/ ".NET Aspire documentation" +[chart-js]: https://www.chartjs.org "Chart.js website" +[codecov]: https://about.codecov.io/ "Codecov website" +[code-coverage]: https://app.codecov.io/gh/martincostello/benchmarks-dashboard "Code coverage for the benchmarks dashboard" +[coverlet]: https://github.com/coverlet-coverage/coverlet "The Coverlet repository on GitHub" +[dotnet-aspire]: https://github.com/dotnet/aspire "The .NET Aspire repository on GitHub" [dotnet-automated-patching]: https://www.youtube.com/live/pOeT1otTi4M?si=9OEq-rm_DTopVNd1&t=172 "On .NET Live - Effortless .NET updates with GitHub Actions" [dotnet-performance]: https://github.com/dotnet/performance "The dotnet/performance repository on GitHub" +[git-submodule]: https://git-scm.com/book/en/v2/Git-Tools-Submodules "Git Tools - Submodules" +[github-actions]: https://github.com/features/actions "GitHub Actions" [github-actions-limits]: https://docs.github.com/en/actions/administering-github-actions/usage-limits-billing-and-administration "GitHub Actions usage limits, billing and administration" +[github-device-flow]: https://docs.github.com/apps/oauth-apps/building-oauth-apps/authorizing-oauth-apps#device-flow "GitHub Device Flow documentation" [github-pages]: https://pages.github.com/ "GitHub Pages" [nswag]: https://github.com/RicoSuter/NSwag "The NSwag repository on GitHub" [openapi-post]: https://blog.martincostello.com/whats-new-for-openapi-with-dotnet-9/ "What's New for OpenAPI with .NET 9" @@ -137,8 +261,10 @@ TODO [publisher-action]: https://github.com/martincostello/benchmarkdotnet-results-publisher "The benchmarkdotnet-results-publisher repository on GitHub" [publisher-inspiration]: https://github.com/benchmark-action/github-action-benchmark "The github-action-benchmark repository on GitHub" [regression-comment]: https://github.com/martincostello/project-euler/pull/335#issuecomment-2302688319 "Example of a comment on a pull request for a regression" +[report-generator]: https://github.com/danielpalme/ReportGenerator "The ReportGenerator repository on GitHub" [runtime-regression]: https://github.com/dotnet/runtime/issues/107869 "Performance regression with JsonObject creation by +70%" [swashbuckle]: https://github.com/domaindrivendev/Swashbuckle.AspNetCore "The Swashbuckle.AspNetCore repository 
on GitHub" +[xunit]: https://xunit.net/ "xUnit.net" From 8de10aa86eed4e75328f7658b3acd2d6b2d67842 Mon Sep 17 00:00:00 2001 From: martin-costello Date: Sun, 22 Sep 2024 16:44:18 +0100 Subject: [PATCH 5/8] More content Just the summary left. --- ...-continuous-benchmarks-on-a-budget.html.md | 72 +++++++++++++++++-- 1 file changed, 67 insertions(+), 5 deletions(-) diff --git a/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md index c8ddf4e..8268b80 100644 --- a/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md +++ b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md @@ -203,6 +203,10 @@ Similarly, I added a custom domain to the GitHub Pages site, but this was again _additional_ cost. It's still possible to use the default GitHub Pages domain to host the site, you just don't get the custom/vanity URL to serve it over. +> ⚠ī¸ If you need to use device flow with non-public repositories hosted in GitHub.com, you should do so over a custom domain so that you +> can restrict the allowed hosts for CORS to your domain, as otherwise you would need to allow it for the entire GitHub Pages domain, or +> otherwise restrict it somehow (e.g. by referrer or IP address). + ### The End Result With all the pieces in place, at a high-level the solution looks something like this: @@ -224,7 +228,66 @@ up some Benchmark.NET benchmarks and then using a GitHub Actions workflow to run ## Concrete Results -TODO +So with this solution in place, what have I been able to achieve with it so far? + +First, the dashboard was incredibly useful to track the fixes for a number of performance improvements in the new ASP.NET Core OpenAPI +library. These are covered in more detail in [my previous blog post][openapi-post], but the dashboard was invaluable in tracking the +effect of the changes on the performance of the library over time as changes were made, particularly when ASP.NET Core 9 Release Candidate 1 +was released. + +The second concrete outcome from using the dashboard was the discovery of a performance regression in the .NET Runtime in .NET 9. + +With the release of .NET 9 RC1 on the 10th of September 2024, I updated a number of my own applications to use the new version of the runtime +as RC1 is the first preview of .NET 9 with "go-live" support. After updating a number of applications and deploying them to my "production" +environments, I took a look at the dashboard to review any changes in the performance of the applications. + +I expected a good number of the benchmarks to show that the time taken for the benchmarks had reduced and/or used less memory. This was the +case for the majority of the benchmarks, but there was one benchmark that bucked the trend and went in the wrong direction. 
+
+Going back to the chart shown at the top of this blog post, you can see that the red line denoting memory usage has a noticeable,
+and consistent, uptick a few commits ago:
+
+A chart showing a time series for performance and memory usage with an increase in memory usage in the most recent data points
+
+If we hover over the first data point in the uptick, we can see that the change is from the upgrade from .NET 8 to .NET 9 RC1:
+
+The above chart with a tooltip showing the Git commit associated with the increase in memory usage
+
+I hadn't spotted this regression previously as the benchmark data is something I'd started collecting relatively recently, and the trends
+didn't go back far enough to show the regression at the time it was made through my testing of the .NET 9 pre-releases. It was only when
+I merged the upgrade to `main`, and compared it against the data I'd already been collecting in that branch for .NET 8, that the difference
+became apparent.
+
+The regression also escaped the regression comment functionality of the GitHub Action. The memory used compared to the previous commit was
+~106% - this is lower than the default threshold of 200% (i.e. double, carried through from [github-action-benchmark][publisher-inspiration])
+to avoid noisy false positives from variance in the performance of the GitHub Actions runners. When I've been running these benchmarks
+for a bit longer, I might revisit this threshold to see if it can be lowered (either by changing the config, or maybe the default itself)
+to avoid missing such regressions in the future. In this case, it was manual review that spotted it, rather than anything automated.
+
+The [specific benchmark][regression-benchmark] calls [an endpoint][regression-endpoint] that I use as the health endpoint in a number of
+my applications for containers deployed to Azure App Service. The endpoint uses `JsonObject` to return a JSON payload that contains a number
+of useful properties about the application, such as the Git commit it was built from, the version of .NET it's running, etc. This isn't an area
+I would have expected to see a regression in, but it also isn't on a critical path, so it wouldn't have been particularly noticeable in usage of
+the applications themselves. It also turned out not to be an anomaly, as the same endpoint is copy-pasted into several of my applications,
+and each one showed the same regression.
+
+I figured it would be worth raising the issue with the .NET team, so I created a more pared-down version of the benchmark. The original benchmark
+is an "end-to-end" benchmark that calls the endpoint over HTTP, so I extracted the body of the endpoint into a separate method and then benchmarked
+it in isolation (a simplified sketch of the kind of code involved is shown below). By itself, the same code showed the same regression, but without
+being compensated for by improvements elsewhere in the .NET 9 runtime and ASP.NET Core 9, the regression was relatively significant. Compared to
+.NET 8, the memory usage had increased by 70% and the time taken to run the benchmark had increased by 90%. Ouch.
+
+I raised the issue with the .NET team, and they were able to identify the cause of the regression as part of
+[adding support for explicit ordering of the properties of `JsonObject`][json-ordering]: [dotnet/runtime#107869][runtime-regression].
+The issue was fixed just three days later, and will be included in release candidate 2 of .NET 9 in October. The team also added new benchmarks
+to their existing suite to ensure that such a regression in this area doesn't slip by in the future.
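+
+For context, the work that regressed boils down to constructing a small `JsonObject` each time the endpoint is called. The sketch
+below is a simplified, hypothetical version of that kind of payload - the property names here are made up, and the real endpoint
+and benchmark are linked above:
+
+```csharp
+using System;
+using System.Runtime.InteropServices;
+using System.Text.Json.Nodes;
+
+// Hypothetical property names, for illustration only.
+var payload = new JsonObject()
+{
+    ["framework"] = RuntimeInformation.FrameworkDescription,
+    ["operatingSystem"] = RuntimeInformation.OSDescription,
+    ["architecture"] = RuntimeInformation.ProcessArchitecture.ToString(),
+    ["timestamp"] = DateTimeOffset.UtcNow.ToString("u"),
+};
+
+// Constructing the JsonObject is the work that regressed between .NET 8 and .NET 9 RC1.
+Console.WriteLine(payload.ToJsonString());
+```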
+ +I think both these examples of otherwise unnoticed issues demonstrate the usefulness of having a continuous benchmarking solution in place! ## Summary @@ -254,18 +317,17 @@ TODO [github-actions-limits]: https://docs.github.com/en/actions/administering-github-actions/usage-limits-billing-and-administration "GitHub Actions usage limits, billing and administration" [github-device-flow]: https://docs.github.com/apps/oauth-apps/building-oauth-apps/authorizing-oauth-apps#device-flow "GitHub Device Flow documentation" [github-pages]: https://pages.github.com/ "GitHub Pages" +[json-ordering]: https://github.com/dotnet/core/blob/main/release-notes/9.0/preview/preview6/libraries.md#ordering-jsonobject-properties "Ordering JsonObject properties" [nswag]: https://github.com/RicoSuter/NSwag "The NSwag repository on GitHub" [openapi-post]: https://blog.martincostello.com/whats-new-for-openapi-with-dotnet-9/ "What's New for OpenAPI with .NET 9" [plotly]: https://plotly.com/javascript/ "Plotly JavaScript Open Source Graphing Library" [power-bi]: https://learn.microsoft.com/power-bi/fundamentals/power-bi-overview "What is Power BI?" [publisher-action]: https://github.com/martincostello/benchmarkdotnet-results-publisher "The benchmarkdotnet-results-publisher repository on GitHub" [publisher-inspiration]: https://github.com/benchmark-action/github-action-benchmark "The github-action-benchmark repository on GitHub" +[regression-benchmark]: https://github.com/martincostello/api/blob/28fc4e2a9267e98303ff896e5e3a1da292201d2b/tests/API.Benchmarks/ApiBenchmarks.cs#L42-L44 "The benchmark that regressed" [regression-comment]: https://github.com/martincostello/project-euler/pull/335#issuecomment-2302688319 "Example of a comment on a pull request for a regression" +[regression-endpoint]: https://github.com/martincostello/api/blob/28fc4e2a9267e98303ff896e5e3a1da292201d2b/src/API/ApiModule.cs#L85-L114 "The endpoint that regressed" [report-generator]: https://github.com/danielpalme/ReportGenerator "The ReportGenerator repository on GitHub" [runtime-regression]: https://github.com/dotnet/runtime/issues/107869 "Performance regression with JsonObject creation by +70%" [swashbuckle]: https://github.com/domaindrivendev/Swashbuckle.AspNetCore "The Swashbuckle.AspNetCore repository on GitHub" [xunit]: https://xunit.net/ "xUnit.net" - - From 2ed6e4f1409bc3de629db6072d431f95695a4585 Mon Sep 17 00:00:00 2001 From: Martin Costello Date: Sun, 22 Sep 2024 16:47:56 +0100 Subject: [PATCH 6/8] Fix line endings I think. From 21bc5255475e80a9658fbd6c05986020641abde9 Mon Sep 17 00:00:00 2001 From: Martin Costello Date: Sun, 22 Sep 2024 16:50:51 +0100 Subject: [PATCH 7/8] Fix size Fix incorrect image size from merge. --- source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md b/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md index f55422d..f34c82b 100644 --- a/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md +++ b/source/2024-09-09-whats-new-for-openapi-with-dotnet-9.html.md @@ -7,7 +7,7 @@ description: "A look at the new Microsoft.AspNetCore.OpenApi package in .NET 9 a image: "https://cdn.martincostello.com/blog_openapi.png" --- -The OpenAPI logo +The OpenAPI logo Developers in the .NET ecosystem have been writing APIs with ASP.NET and ASP.NET Core for years, and [OpenAPI][openapi] (nÊe Swagger) has been a popular choice for documenting those APIs. 
From 6729128247e8d6e614140f3cde15078a145cdae5 Mon Sep 17 00:00:00 2001 From: martin-costello Date: Sun, 22 Sep 2024 17:01:44 +0100 Subject: [PATCH 8/8] Add summary. Fin. --- ...4-09-23-continuous-benchmarks-on-a-budget.html.md | 12 +++++++++++- 1 file changed, 11 insertions(+), 1 deletion(-) diff --git a/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md index 8268b80..b259990 100644 --- a/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md +++ b/source/2024-09-23-continuous-benchmarks-on-a-budget.html.md @@ -291,7 +291,17 @@ I think both these examples of otherwise unnoticed issues demonstrate the useful ## Summary -TODO +In this post I've covered how I set up a continuous benchmarking solution using GitHub Actions, GitHub Pages and Blazor to run and visualise +the results of BenchmarkDotNet benchmarks without needing to spend any money on hardware, software or infrastructure. The solution is good +enough to provide a consistent relative view of the performance of the software I maintain over time, and to spot any regressions in their performance. + +I'm looking forward to see what changes, and any issues, this setup might reveal in 2025 and beyond once .NET 10 development kicks off. + +If you'd like to run your own copy of this solution, or if you have suggestions about how to improve or extend it, feel free to open an issue +in either the [action][publisher-action] or [dashboard][benchmarks-dashboard] repositories. I'd also be curious to hear about any other issues +you might find that you wouldn't have otherwise noticed if you adopt this approach for your own code projects. + +I hope you've found this post interesting and its given you some inspiration to add a similar capability to your own workflows. 💡 [acr-housekeeping]: https://github.com/martincostello/github-automation/blob/adf8d8b14b6b8ac7be8ca8f30614ac4dfb137642/.github/workflows/acr-housekeeping.yml "A GitHub Actions workflow to clean up ACR images" [aspnetcore-benchmarks-dashboard]: https://aka.ms/aspnet/benchmarks "ASP.NET Core Benchmarks Power BI Dashboard"