As we know, .NET Conf 2022 took place on 8-10 November. During the conference, exciting news and performance improvements around .NET 7 and C# 11 were announced. This release particularly focuses on topics that help us develop faster, more lightweight, and cloud-native applications more easily.
In this article, I will go over some of the news that I like.
NOTE: First, if you don’t have the .NET 7 release yet, you can download it from here.
As with every .NET release, it has again been stated that this release brings major performance improvements, perhaps the biggest so far. And they really have delivered! A significant part of these improvements was made on the JIT side, and I briefly want to touch on them.
As we know, the JIT is responsible for compiling MSIL code into native code at runtime. In order for our applications to run efficiently, it performs many different optimizations in the background, taking the environment and the process into account. As we might imagine, such just-in-time optimizations are time-consuming by nature and come with tradeoffs. For example, if the JIT performs full optimizations up front, the start-up time of the application increases; if it skips or only partially applies them, start-up is fast but throughput may decrease. To reduce this tradeoff, the JIT has used Tiered Compilation by default since .NET Core 3. Instead of compiling a method only once, the JIT can recompile hot methods at runtime based on their usage statistics and hot-swap in the optimized code, achieving better performance over time.
With .NET 7, in addition to other JIT-side performance improvements, the On-Stack Replacement (OSR) technique is used to address these tradeoffs even further. Thanks to OSR, the JIT can switch to optimized code not only between method invocations but even while the relevant method is running.
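By the way, if we want to experiment with this behavior, the runtime exposes a few MSBuild properties related to tiered compilation. Below is a minimal sketch; the property names are the documented ones, and the values are just examples for experimentation rather than recommendations.

<PropertyGroup>
  <!-- Tiered compilation is enabled by default; disabling it makes the JIT always produce fully optimized code up front -->
  <TieredCompilation>true</TieredCompilation>
  <!-- Dynamic profile-guided optimization: recompilation is guided by runtime profiling data -->
  <TieredPGO>true</TieredPGO>
  <!-- Quick JIT for methods containing loops; OSR allows such methods to be re-optimized while they are running -->
  <TieredCompilationQuickJitForLoops>true</TieredCompilationQuickJitForLoops>
</PropertyGroup>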
Apart from these, many other areas have also received performance improvements, such as Threading, Networking, Collections and LINQ. In short, simply by retargeting our applications to .NET 7, we will get a nice performance gain by default.
First, I would like to start with Native Ahead-of-Time (AOT) compilation, a topic that excites me. As we know, the .NET team has been working on Native AOT for a while, and they had announced that it would move from experimental status to mainline development with .NET 7. So with this release, Native AOT is now officially available for console applications and class libraries.
In short, Native AOT generates native code at compile time instead of at run time: when publishing the application, it compiles the IL code to native code for the specified runtime identifier. As a result, Native AOT applications don’t need the JIT while running, which means we can run them in environments that don’t have the .NET runtime installed. Although similar capabilities were offered before under features such as “ReadyToRun”, the concept has evolved to a much better point with Native AOT.
Some benefits of Native AOT: faster start-up time, a smaller memory footprint, a smaller deployment size, and the ability to run on machines that don’t have the .NET runtime installed.
Of course, Native AOT also has some limitations, such as no dynamic loading of assemblies, no run-time code generation (System.Reflection.Emit), trimming-related restrictions on unannotated reflection, and the fact that in this release it covers only console applications and class libraries.
Although it has these limitations for now, I’m sure it will reach a good point in the future.
Let’s create a console application targeting .NET 7 in order to perform a quick test. Then, let’s use the following simple piece of code, which checks whether the given input is a palindrome.
string? input = Console.ReadLine();
bool result = IsPalindrome(input);

Console.WriteLine($"The input '{input}' is a palindrome: {result}");
Console.ReadLine();

static bool IsPalindrome(string? input)
{
    if (string.IsNullOrEmpty(input))
    {
        return false;
    }

    bool result = true;
    for (int i = 0; i < input.Length; i++)
    {
        if (input[i] != input[(input.Length - 1) - i])
        {
            result = false;
        }
    }

    return result;
}
Now, in order to compile this application natively, we need to add a property into the project file as below.
<PublishAot>true</PublishAot>
If we don’t want to add a property, we can also pass this parameter along with the publish command.
-p:PublishAot=true
Then, we will be able to compile the application natively by specifying the runtime identifier that we want. For example, we can use the “win-x64” identifier for Windows, and “linux-x64” or “linux-arm64” for Linux.
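For example, assuming the PublishAot property is already set in the project file, a typical publish command for Windows could look like this:

dotnet publish -c Release -r win-x64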
NOTE: If we compile the application on Ubuntu 20.04, the output will only run on the same version or higher. In short, we need to pay attention to the Linux version we compile on.
Now let’s use the below Dockerfile to test it.
FROM mcr.microsoft.com/dotnet/sdk:7.0 AS build

# Install NativeAOT build prerequisites
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        clang zlib1g-dev

WORKDIR /source
COPY . .
RUN dotnet publish -c release -r linux-x64 -o /app

FROM debian:bullseye-slim
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["/app/NativeAOTTest"]
If we take a look at the images, we use the Debian-based “dotnet/sdk:7.0” image to compile the application, and the “debian:bullseye-slim” image, which doesn’t contain the .NET runtime, as the runtime image.
In addition, before publishing the application on a Linux machine, we also need to have the relevant build prerequisites installed.
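These are the same prerequisites that the Dockerfile above installs; on a Debian/Ubuntu-based machine they can be installed like this:

sudo apt-get update
sudo apt-get install -y clang zlib1g-dev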
As we can see, the application runs perfectly inside a container image that doesn’t have the .NET runtime in it.
I think Native AOT can be very useful, especially for serverless solutions. Considering that execution durations and start-up (cold-start) times matter there, we can gain good benefits at these points by using Native AOT.
At the beginning of the article, I mentioned that this release focuses on topics that help us develop more efficient and fast cloud-native applications. In this context, “Built-in Container Support” is one of them.
Although it seems like a very small feature, I found it quite handy for getting things done quickly. Especially when we don’t need a custom Dockerfile, we can containerize our application with a single parameter specified while publishing it.
For this process, we only need to add the following package via NuGet.
dotnet add package Microsoft.NET.Build.Containers
Then, when we perform the publish operation as follows, the container image will be created automatically.
dotnet publish --os linux --arch x64 -c Release -p:PublishProfile=DefaultContainer
I performed a test for a .NET 7 Web API project as follows.
dotnet new webapi -n my-test-api
As we can see, the container image has been successfully created.
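Then we can run the image locally with Docker. A quick sketch, assuming the default image name derived from the assembly name and the default “1.0.0” version tag:

docker run -it --rm -p 5000:80 my-test-api:1.0.0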
As the base image, Debian-based Linux images are used by default. If we want to use a different distribution, we can do this by specifying the “ContainerBaseImage” property as follows.
<ContainerBaseImage>mcr.microsoft.com/dotnet/aspnet:7.0-alpine</ContainerBaseImage>
Also, the “AssemblyName” is used as the container image name by default, and the “Version” property is used as the tag.
If we want, we can also change them as follows.
<ContainerImageName>my-app</ContainerImageName>
<Version>1.2.3-alpha2</Version>
As a limitation, it only supports Linux-based containers for now.
As we know, the easiest method we can use for in-memory caching in ASP.NET Core is to use IMemoryCache.
With .NET 7, a new API has also been added to IMemoryCache for metrics support. With MemoryCacheStatistics, we can now access information such as the estimated size of the cache, how many entries it holds, and how it is being used.
I think accessing metrics about the application’s in-memory cache and taking action accordingly will be beneficial for the health of the application.
In order to access this metrics information, we need to call the “GetCurrentStatistics()” method on IMemoryCache. Also, in order to track these metrics continuously, we can either use the EventCounters API together with the dotnet-counters tool, or the .NET metrics API with OpenTelemetry.
[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private readonly IMemoryCache _memoryCache;

    public WeatherForecastController(IMemoryCache memoryCache)
    {
        _memoryCache = memoryCache;
    }

    [HttpGet("stats")]
    public ActionResult<MemoryCacheStatistics> GetStats()
    {
        return Ok(_memoryCache.GetCurrentStatistics());
    }
}
Also when adding IMemoryCache to the service collection, we need to set the “TrackStatistics” parameter to “true”.
builder.Services.AddMemoryCache(c => c.TrackStatistics = true);
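To give an idea of what we get back, MemoryCacheStatistics exposes counters like the current entry count, the estimated size (when entry sizes are configured) and the total hits and misses. A minimal sketch of reading them, assuming an injected IMemoryCache instance named "memoryCache":

MemoryCacheStatistics? stats = memoryCache.GetCurrentStatistics();
if (stats is not null)
{
    Console.WriteLine($"Entries: {stats.CurrentEntryCount}");
    Console.WriteLine($"Estimated size: {stats.CurrentEstimatedSize}");
    Console.WriteLine($"Hits: {stats.TotalHits}, misses: {stats.TotalMisses}");
}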
Although it is not a big feature, I like the idea of managing the versions of common NuGet packages used by multiple projects from a central location.
For this, we need to create a file called Directory.Packages.props in the root folder of the relevant solution and define the packages we want as follows.
<Project>
  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
  </PropertyGroup>
  <ItemGroup>
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.1" />
  </ItemGroup>
</Project>
Then, it is sufficient to reference the relevant package by name, without specifying a version, in the project files we want.
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net7.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Newtonsoft.Json" />
</ItemGroup>
</Project>
With the “required” keyword that comes with C# 11, enforcing that certain members are set during object initialization has become very easy.
public class MyClass
{
    public required string MyRequiredParam { get; init; }
    public string? MyOptionalParam { get; init; }
}
As we can see above, when we use the “required” keyword, it becomes mandatory to set the “MyRequiredParam” property while initializing the related class.
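For example, the compiler enforces this at the call site; in the sketch below the second line would not compile because the required member is not set in the object initializer:

var valid = new MyClass { MyRequiredParam = "hello" };   // compiles
// var invalid = new MyClass();                          // compile-time error: required member must be set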
I have always had a special interest in the actor model, and I follow the Orleans project closely in this regard. I have written a few articles and given a seminar about Orleans before.
If you want to check them out, you can reach them here.
As part of .NET 7, great performance improvements have also been made on the Orleans side. There are improvements around immutability, and there is a new serializer. Orleans was already a great, high-performance tool when I was working with it a couple of years ago, and I had developed a few different applications with it. Now I’m curious how fast it has become.
As we know, SQL Server has supported JSON columns for a long time. With this release, we also get JSON column support on the EF Core side. With LINQ, we can now perform queries and other operations on JSON aggregates on the SQL Server side.
For example, let’s suppose we have a schema like below.
public class Product
{
    public string Name { get; set; }
    public string Description { get; set; }
    public Price PriceDetails { get; set; }
}

public class Price
{
    public decimal List { get; set; }
    public decimal Retail { get; set; }
}
At this point, we want to keep the PriceDetails information as a JSON column. The only thing we need to do to map PriceDetails as a JSON column is to call the “ToJson()” method during model configuration as follows.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Product>().OwnsOne(
        product => product.PriceDetails,
        navigation =>
        {
            navigation.ToJson();
        });
}
The rest is up to our LINQ skills.
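For example, assuming a "Products" DbSet on our context, a query that filters on a property stored inside the JSON column could look like the sketch below; EF Core translates it into SQL Server’s JSON functions (such as JSON_VALUE):

var discountedProducts = await context.Products
    .Where(p => p.PriceDetails.Retail < p.PriceDetails.List)
    .ToListAsync();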
On the bulk operations side, two new methods have been introduced: “ExecuteUpdateAsync” and “ExecuteDeleteAsync”. By using these methods, we can perform bulk update and delete operations with LINQ.
await context.Tags
    .Where(t => t.Posts.All(e => e.PublishedOn < priorToDateTime))
    .ExecuteDeleteAsync();

await context.Tags
    .Where(t => t.Posts.All(e => e.PublishedOn < priorToDateTime))
    .ExecuteUpdateAsync(s => s.SetProperty(t => t.Text, t => t.Text + " (old)"));
It is very nice that these have been brought into EF Core itself, so we no longer need third-party EF extensions to perform such bulk operations.
Lastly, I would also like to mention the new “Rate-Limiting” middleware in ASP.NET Core.
As we know, rate limiting is an important topic for the APIs we develop, especially publicly accessible ones. It ensures that the API doesn’t get overwhelmed and that its performance doesn’t degrade, and it also provides a kind of protection against attacks such as DoS.
It is a very simple-to-use middleware and comes with 4 different rate-limiting algorithms: “Fixed window”, “Sliding window”, “Token bucket” and “Concurrency”. We can also attach policies at the endpoint level we want.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(rateLimiterOptions =>
    rateLimiterOptions.AddFixedWindowLimiter(policyName: "fixed", options =>
    {
        options.PermitLimit = 100;
        options.Window = TimeSpan.FromSeconds(10);
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = 2;
    }));
For example, this “fixed” policy allows a maximum of 100 requests within a 10-second window.
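In addition to registering the limiter, we also need to add the rate-limiting middleware to the request pipeline:

var app = builder.Build();
app.UseRateLimiter();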
Then we can enable or disable it at any level we want.
app.MapControllers().RequireRateLimiting("fixed");
[ApiController]
[Route("[controller]")]
[EnableRateLimiting("fixed")]
public class WeatherForecastController : ControllerBase
{
    private readonly IMemoryCache _memoryCache;

    public WeatherForecastController(IMemoryCache memoryCache)
    {
        _memoryCache = memoryCache;
    }

    [HttpGet("stats")]
    [DisableRateLimiting]
    public ActionResult<MemoryCacheStatistics> GetStats()
    {
        return Ok(_memoryCache.GetCurrentStatistics());
    }
}
Within the scope of this article, I have tried to cover some of the news and improvements that I liked at first glance. This release didn’t surprise me, because each release brings great achievements, improvements and performance increases. In short, nice job!
The improvements on the CLR side, in particular, are really great, and I’m very curious about where Native AOT will go. Of course, in addition to the news I have mentioned, many other new features and improvements have been made that I didn’t cover here, such as loop and reflection performance optimizations and the newly added Tar archive API.
If there are other points that you like, I’m looking forward to your comments.
https://devblogs.microsoft.com/dotnet/announcing-dotnet-7/
https://devblogs.microsoft.com/dotnet/performance_improvements_in_net_7/
https://devblogs.microsoft.com/dotnet/announcing-builtin-container-support-for-the-dotnet-sdk/
https://devblogs.microsoft.com/dotnet/whats-new-in-orleans-7/
https://learn.microsoft.com/en-gb/aspnet/core/performance/rate-limit?view=aspnetcore-7.0
Comments
Greetings Gökhan,
Unfortunately, "parameter null check" is still not supported in .NET 7.0.
Example: public static void GetPersonMiddleName(string? middleName!!) => this will produce an error.
Have a good day.
Ah, you're right, it was such a nice feature in the preview version, and honestly quite a pleasant one. It seems that keeping article templates prepared in advance isn't always a good idea :( Let's leave this note here for anyone interested: https://github.com/dotnet/csharplang/blob/main/meetings/2022/LDM-2022-04-06.md#parameter-null-checking
Thank you for your effort, master. There are a lot of nice new features coming out these days; it's really hard to keep up.
A nice summary, thanks...
Thank you for your valuable comment.