ASP.NET Core 8 + IIS + EF Core: Proven Memory Management Practices for High‑Performance APIs
When you’ve been building and running APIs for years, you learn that memory management isn’t just about avoiding crashes — it’s about keeping performance predictable, costs under control, and your ops team happy.
ASP.NET Core 8, hosted on IIS and backed by EF Core, is a powerful stack. But like any high‑performance engine, it needs tuning. Below is my go‑to checklist for keeping memory usage in check, based on lessons learned from production workloads.
1️⃣ IIS & Hosting‑Level Practices
Enable Server GC
ASP.NET Core on IIS uses Server GC by default, but I always verify:
$env:DOTNET_GCServer
Make sure it's 1. Server GC is optimized for throughput and multi‑core scalability.
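The environment variable only reflects an explicit override; to confirm what the runtime actually selected, check GCSettings at startup. A minimal sketch:

```csharp
using System;
using System.Runtime;

// Logs the effective GC mode once at startup. If this prints False on a
// multi-core host, check the project file / runtimeconfig before tuning further.
Console.WriteLine($"Server GC: {GCSettings.IsServerGC}");
Console.WriteLine($"Latency mode: {GCSettings.LatencyMode}");
```

You can also pin the mode in the project file with <ServerGarbageCollection>true</ServerGarbageCollection>.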
Set IIS Recycling Rules
Don’t wait for an OutOfMemoryException to take you down. Set a Private Memory limit (KB) so IIS can recycle gracefully before trouble hits.
Disable Synchronous I/O
This one catches people out. AllowSynchronousIO defaults to false since ASP.NET Core 3.0, but I pin it explicitly in Program.cs:
builder.WebHost.ConfigureKestrel(o => o.AllowSynchronousIO = false);
When hosting in‑process on IIS, the equivalent flag lives on IISServerOptions. Keeping sync I/O off prevents accidental large in‑memory buffering from sync reads/writes.
Compression & Buffering
Enable the ResponseCompression middleware for bandwidth savings, but avoid buffering huge responses in memory; stream them instead.
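Wiring compression up takes two calls in Program.cs; a sketch using the built‑in providers:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Brotli/Gzip providers are registered by default; HTTPS compression is
// opt-in because of BREACH-style side-channel concerns.
builder.Services.AddResponseCompression(o => o.EnableForHttps = true);

var app = builder.Build();
app.UseResponseCompression(); // place before middleware that writes responses
app.Run();
```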
2️⃣ EF Core‑Specific Memory Discipline
No Unbounded ToList()
I’ve seen APIs grind to a halt because someone pulled millions of rows into memory. Always page:
var page = await _db.Users
.OrderBy(u => u.Id)
.Skip(pageIndex * pageSize)
.Take(pageSize)
.ToListAsync();
Use AsNoTracking() for Read‑Only Queries
Skip change tracking when you don’t need it:
var data = await _db.Products.AsNoTracking().ToListAsync();
This alone can cut memory usage significantly.
Project Early
Select only the fields you need into DTOs instead of loading full entities.
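For example, projecting into a hypothetical UserSummaryDto (the entity and DTO names here are illustrative):

```csharp
// Only Id and Name are selected in SQL and materialized on the heap;
// the rest of the User entity never leaves the database.
var summaries = await _db.Users
    .AsNoTracking()
    .OrderBy(u => u.Id)
    .Select(u => new UserSummaryDto { Id = u.Id, Name = u.Name })
    .ToListAsync();
```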
Stream Large Reads
Process rows as they arrive:
await foreach (var item in _db.Logs.AsNoTracking().AsAsyncEnumerable())
{
// process item
}
Dispose DbContext Promptly
ASP.NET Core disposes scoped DbContexts at the end of each request, but never hold one in a static field or a long‑lived singleton service; that keeps its tracked entities alive indefinitely.
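When a singleton or background service genuinely needs database access, inject IDbContextFactory instead and create a short‑lived context per unit of work (AppDbContext and CleanupService are placeholder names):

```csharp
public class CleanupService
{
    private readonly IDbContextFactory<AppDbContext> _factory;

    public CleanupService(IDbContextFactory<AppDbContext> factory) => _factory = factory;

    public async Task RunAsync(CancellationToken ct)
    {
        // A fresh context per unit of work: tracked entities are released
        // as soon as the context is disposed, not when the service dies.
        await using var db = await _factory.CreateDbContextAsync(ct);
        // ... query and save here ...
    }
}
```

Register the factory with builder.Services.AddDbContextFactory<AppDbContext>(...).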
3️⃣ API Payload & Serialization
System.Text.Json Source Generation
For hot DTOs, source‑generated serializers reduce allocations:
[JsonSerializable(typeof(MyDto))]
public partial class MyJsonContext : JsonSerializerContext {}
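At call sites, pass the generated type info so the serializer skips reflection metadata entirely:

```csharp
// Uses the source-generated metadata instead of runtime reflection,
// which trims startup cost and per-call allocations.
string json = JsonSerializer.Serialize(dto, MyJsonContext.Default.MyDto);
MyDto? back = JsonSerializer.Deserialize(json, MyJsonContext.Default.MyDto);
```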
Stream Large JSON
Write directly to Response.Body instead of building massive strings in memory.
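In a minimal‑API handler that can look like this (GetLogsAsync is a hypothetical IAsyncEnumerable source):

```csharp
app.MapGet("/logs", async (HttpContext ctx) =>
{
    ctx.Response.ContentType = "application/json";
    // SerializeAsync writes chunks directly to the response stream, so the
    // full payload never exists as a single string or byte[] in memory.
    await JsonSerializer.SerializeAsync(ctx.Response.Body, GetLogsAsync());
});
```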
Limit Request Body Size
Protect your API from oversized payloads:
builder.WebHost.ConfigureKestrel(o =>
{
o.Limits.MaxRequestBodySize = 10 * 1024 * 1024; // 10 MB
});
4️⃣ Caching & Object Reuse
Bound Your IMemoryCache
services.AddMemoryCache(o => o.SizeLimit = 256 * 1024 * 1024);
Note that SizeLimit has no built‑in unit: it caps the sum of the Size values you assign to entries, so decide up front whether Size means bytes or entry counts.
Eviction Policies
Always set size and expiration:
cache.Set(key, value, new MemoryCacheEntryOptions
{
Size = 1,
SlidingExpiration = TimeSpan.FromMinutes(5)
});
Pool Large Buffers
For file or stream processing:
var buffer = ArrayPool<byte>.Shared.Rent(8192);
try { /* use buffer; note Rent may return a larger array than requested */ }
finally { ArrayPool<byte>.Shared.Return(buffer); }
5️⃣ Diagnostics & Leak Prevention
Enable GC & Memory Metrics
Use dotnet-counters or Application Insights to track:
Gen 2 GC count
LOH size
Allocation rate
Load Test with Realistic Data
Synthetic tests are fine for smoke checks, but real‑world payloads reveal the truth. Watch for steady memory growth — it’s a leak warning.
Common Leak Sources in ASP.NET Core APIs
Static collections holding request data
Event handlers never unsubscribed
Large logs or exception messages kept in memory
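The first of these is worth showing, because it looks harmless in code review (RequestAudit is an illustrative type):

```csharp
public static class AuditLog
{
    // Anti-pattern: a static list is rooted for the process lifetime, so
    // every request's data stays reachable and Gen 2 / LOH grow without bound.
    public static readonly List<RequestAudit> Entries = new();
}
```

Prefer a bounded channel, an external store, or an IMemoryCache entry with Size and expiration set.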
The single biggest win in my experience: AsNoTracking() combined with projection and pagination. I've seen this cut memory usage by 50–80% in busy APIs.
Wrapping Up
Memory management in ASP.NET Core 8 with IIS and EF Core isn’t about one magic setting — it’s about layered discipline. From IIS configuration to EF Core query patterns, serialization choices, and caching strategies, each layer contributes to the overall footprint.
Get this right, and your API will run smoother, scale better, and cost less to operate.