Wednesday, 03 December 2025
Welcome to Part 2 of our comprehensive guide to data access in .NET! In Part 1, we explored Entity Framework Core in depth, including SQL generation, common pitfalls, and that critical warning about proxies and caching.
In this article, we'll explore the lighter-weight alternatives and how to combine multiple approaches for optimal performance:
Dapper is a lightweight, high-performance micro-ORM created by Stack Overflow. It provides a thin layer over ADO.NET, handling the tedious work of mapping query results to objects while giving you full SQL control.
Dapper was born out of Stack Overflow's need for high-performance data access. The team found that full ORMs like Entity Framework (pre-Core) added too much overhead for their high-traffic scenarios. Dapper gives you 95% of the convenience with only 5-15% overhead over raw ADO.NET.
using Npgsql;
using Dapper;
public class DapperBlogRepository
{
private readonly string _connectionString;
public DapperBlogRepository(string connectionString)
{
_connectionString = connectionString;
// Configure Dapper to work with PostgreSQL naming conventions
DefaultTypeMap.MatchNamesWithUnderscores = true;
}
// Simple query
public async Task<IEnumerable<BlogPost>> GetRecentPostsAsync(int count)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
SELECT id, title, content, tags, published_date, category_id
FROM blog_posts
ORDER BY published_date DESC
LIMIT @Count";
return await connection.QueryAsync<BlogPost>(sql, new { Count = count });
}
// Query with WHERE clause
public async Task<BlogPost> GetPostByIdAsync(int id)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
SELECT id, title, content, published_date
FROM blog_posts
WHERE id = @Id";
return await connection.QueryFirstOrDefaultAsync<BlogPost>(sql, new { Id = id });
}
// Insert with returning ID
public async Task<int> CreatePostAsync(BlogPost post)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
INSERT INTO blog_posts (title, content, published_date, category_id)
VALUES (@Title, @Content, @PublishedDate, @CategoryId)
RETURNING id";
return await connection.ExecuteScalarAsync<int>(sql, post);
}
// Update
public async Task UpdatePostAsync(BlogPost post)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
UPDATE blog_posts
SET title = @Title,
content = @Content,
published_date = @PublishedDate
WHERE id = @Id";
await connection.ExecuteAsync(sql, post);
}
// Delete
public async Task DeletePostAsync(int id)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = "DELETE FROM blog_posts WHERE id = @Id";
await connection.ExecuteAsync(sql, new { Id = id });
}
}
One of Dapper's most powerful features is multi-mapping - efficiently handling joins and mapping to multiple related objects:
public async Task<IEnumerable<BlogPost>> GetPostsWithCategoryAsync()
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
SELECT
p.id, p.title, p.content, p.published_date,
c.id, c.name, c.description
FROM blog_posts p
INNER JOIN categories c ON p.category_id = c.id
ORDER BY p.published_date DESC";
return await connection.QueryAsync<BlogPost, Category, BlogPost>(
sql,
(post, category) =>
{
post.Category = category;
return post;
},
splitOn: "id" // Split at the second "id" column
);
}
// More complex: Posts with comments
public async Task<IEnumerable<BlogPost>> GetPostsWithCommentsAsync()
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
SELECT
p.id, p.title, p.content,
c.id, c.author, c.content, c.created_at
FROM blog_posts p
LEFT JOIN comments c ON p.id = c.blog_post_id
ORDER BY p.published_date DESC, c.created_at";
var postDict = new Dictionary<int, BlogPost>();
await connection.QueryAsync<BlogPost, Comment, BlogPost>(
sql,
(post, comment) =>
{
if (!postDict.TryGetValue(post.Id, out var existingPost))
{
existingPost = post;
existingPost.Comments = new List<Comment>();
postDict.Add(post.Id, existingPost);
}
if (comment != null)
{
existingPost.Comments.Add(comment);
}
return existingPost;
},
splitOn: "id"
);
return postDict.Values;
}
Dynamic Parameters for Complex Queries:
public async Task<IEnumerable<BlogPost>> SearchWithDynamicFiltersAsync(SearchCriteria criteria)
{
using var connection = new NpgsqlConnection(_connectionString);
var parameters = new DynamicParameters();
var conditions = new List<string>();
var sql = new StringBuilder("SELECT * FROM blog_posts");
if (!string.IsNullOrEmpty(criteria.SearchTerm))
{
conditions.Add("search_vector @@ to_tsquery('english', @SearchTerm)");
parameters.Add("SearchTerm", criteria.SearchTerm);
}
if (criteria.CategoryIds?.Any() == true)
{
conditions.Add("category_id = ANY(@CategoryIds)");
parameters.Add("CategoryIds", criteria.CategoryIds);
}
if (criteria.FromDate.HasValue)
{
conditions.Add("published_date >= @FromDate");
parameters.Add("FromDate", criteria.FromDate.Value);
}
if (criteria.Tags?.Any() == true)
{
conditions.Add("tags && @Tags"); // PostgreSQL array overlap
parameters.Add("Tags", criteria.Tags);
}
if (conditions.Any())
{
sql.Append(" WHERE ");
sql.Append(string.Join(" AND ", conditions));
}
sql.Append(" ORDER BY published_date DESC LIMIT @Limit");
parameters.Add("Limit", criteria.Limit);
return await connection.QueryAsync<BlogPost>(sql.ToString(), parameters);
}
Custom Type Handlers for PostgreSQL Types:
// Handle PostgreSQL arrays
public class PostgresArrayTypeHandler : SqlMapper.TypeHandler<string[]>
{
public override void SetValue(IDbDataParameter parameter, string[] value)
{
parameter.Value = value;
((NpgsqlParameter)parameter).NpgsqlDbType = NpgsqlDbType.Array | NpgsqlDbType.Text;
}
public override string[] Parse(object value)
{
return (string[])value;
}
}
// Handle PostgreSQL JSONB
public class JsonTypeHandler<T> : SqlMapper.TypeHandler<T>
{
public override void SetValue(IDbDataParameter parameter, T value)
{
parameter.Value = JsonSerializer.Serialize(value);
((NpgsqlParameter)parameter).NpgsqlDbType = NpgsqlDbType.Jsonb;
}
public override T Parse(object value)
{
return JsonSerializer.Deserialize<T>(value.ToString());
}
}
// Register handlers (in startup)
SqlMapper.AddTypeHandler(new PostgresArrayTypeHandler());
SqlMapper.AddTypeHandler(new JsonTypeHandler<Dictionary<string, object>>());
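With the JSONB handler registered, a jsonb column maps straight onto a dictionary property. A minimal usage sketch - the metadata column and the PostWithMetadata type here are illustrative, not part of the schema used elsewhere in this article:

```csharp
public class PostWithMetadata
{
    public int Id { get; set; }
    public string Title { get; set; }
    // Populated by JsonTypeHandler<Dictionary<string, object>> from the jsonb column
    public Dictionary<string, object> Metadata { get; set; }
}

public async Task<IEnumerable<PostWithMetadata>> GetPostsWithMetadataAsync()
{
    using var connection = new NpgsqlConnection(_connectionString);
    return await connection.QueryAsync<PostWithMetadata>(
        "SELECT id, title, metadata FROM blog_posts");
}
```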
Bulk Operations with PostgreSQL COPY:
public async Task BulkInsertPostsAsync(IEnumerable<BlogPost> posts)
{
using var connection = new NpgsqlConnection(_connectionString);
await connection.OpenAsync();
using var writer = await connection.BeginBinaryImportAsync(
"COPY blog_posts (title, content, tags, published_date) FROM STDIN (FORMAT BINARY)"
);
foreach (var post in posts)
{
await writer.StartRowAsync();
await writer.WriteAsync(post.Title);
await writer.WriteAsync(post.Content);
await writer.WriteAsync(post.Tags, NpgsqlDbType.Array | NpgsqlDbType.Text);
await writer.WriteAsync(post.PublishedDate);
}
await writer.CompleteAsync();
}
Transaction Support:
public async Task TransferPostToCategoryAsync(int postId, int newCategoryId)
{
using var connection = new NpgsqlConnection(_connectionString);
await connection.OpenAsync();
using var transaction = await connection.BeginTransactionAsync();
try
{
// Update the post
await connection.ExecuteAsync(
"UPDATE blog_posts SET category_id = @CategoryId WHERE id = @PostId",
new { CategoryId = newCategoryId, PostId = postId },
transaction
);
// Log the change
await connection.ExecuteAsync(
@"INSERT INTO category_history (post_id, category_id, changed_at)
VALUES (@PostId, @CategoryId, @ChangedAt)",
new { PostId = postId, CategoryId = newCategoryId, ChangedAt = DateTime.UtcNow },
transaction
);
await transaction.CommitAsync();
}
catch
{
await transaction.RollbackAsync();
throw;
}
}
✅ Use Dapper When:
❌ Avoid Dapper When:
For absolute maximum performance and control, you can use Npgsql directly without any ORM layer.
Raw ADO.NET is appropriate when:
public class NpgsqlBlogRepository
{
private readonly string _connectionString;
public async Task<List<BlogPost>> GetRecentPostsAsync(int count)
{
var posts = new List<BlogPost>();
using var connection = new NpgsqlConnection(_connectionString);
await connection.OpenAsync();
using var command = new NpgsqlCommand(
"SELECT id, title, content, tags, published_date FROM blog_posts ORDER BY published_date DESC LIMIT @count",
connection
);
command.Parameters.AddWithValue("count", count);
using var reader = await command.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
posts.Add(new BlogPost
{
Id = reader.GetInt32(0),
Title = reader.GetString(1),
Content = reader.GetString(2),
Tags = reader.GetFieldValue<string[]>(3),
PublishedDate = reader.GetDateTime(4)
});
}
return posts;
}
// Using prepared statements for repeated queries
public async Task<BlogPost> GetPostByIdAsync(int id)
{
using var connection = new NpgsqlConnection(_connectionString);
await connection.OpenAsync();
using var command = new NpgsqlCommand(
"SELECT id, title, content FROM blog_posts WHERE id = $1",
connection
);
command.Parameters.AddWithValue(id);
await command.PrepareAsync(); // Prepared statement for performance
using var reader = await command.ExecuteReaderAsync();
if (await reader.ReadAsync())
{
return new BlogPost
{
Id = reader.GetInt32(0),
Title = reader.GetString(1),
Content = reader.GetString(2)
};
}
return null;
}
// Working with PostgreSQL JSONB
public async Task<Dictionary<string, object>> GetPostMetadataAsync(int id)
{
using var connection = new NpgsqlConnection(_connectionString);
await connection.OpenAsync();
using var command = new NpgsqlCommand(
"SELECT metadata FROM blog_posts WHERE id = $1",
connection
);
command.Parameters.AddWithValue(id);
var json = await command.ExecuteScalarAsync() as string;
return JsonSerializer.Deserialize<Dictionary<string, object>>(json);
}
// Streaming large result sets
public async IAsyncEnumerable<BlogPost> StreamAllPostsAsync()
{
using var connection = new NpgsqlConnection(_connectionString);
await connection.OpenAsync();
using var command = new NpgsqlCommand(
"SELECT id, title, content FROM blog_posts ORDER BY id",
connection
);
using var reader = await command.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
yield return new BlogPost
{
Id = reader.GetInt32(0),
Title = reader.GetString(1),
Content = reader.GetString(2)
};
}
}
}
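The streaming method above returns an IAsyncEnumerable&lt;BlogPost&gt;, so callers can process rows as they arrive instead of materialising the whole result set. A quick usage sketch, assuming the repository exposes a constructor that takes the connection string (as the Dapper repository earlier does):

```csharp
var repository = new NpgsqlBlogRepository(connectionString);

// Rows are read from the open data reader one at a time as the loop advances
await foreach (var post in repository.StreamAllPostsAsync())
{
    Console.WriteLine($"{post.Id}: {post.Title}");
}
```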
When working with Dapper or raw ADO.NET, you often need to map between different object representations (DTOs, entities, view models). Several libraries can automate this.
Mapster is a fast, convention-based object mapper that can also use source generation (via Mapster.Tool) to minimise runtime overhead.
// Install: Mapster and Mapster.Tool
using Mapster;
public class BlogPostDto
{
public int Id { get; set; }
public string Title { get; set; }
public string Summary { get; set; }
public List<string> CategoryNames { get; set; }
}
public class BlogPost
{
public int Id { get; set; }
public string Title { get; set; }
public string Content { get; set; }
public List<Category> Categories { get; set; }
}
// Configuration
public class MappingConfig : IRegister
{
public void Register(TypeAdapterConfig config)
{
config.NewConfig<BlogPost, BlogPostDto>()
.Map(dest => dest.Summary, src => src.Content.Substring(0, Math.Min(200, src.Content.Length)))
.Map(dest => dest.CategoryNames, src => src.Categories.Select(c => c.Name).ToList());
// Reverse map with ignore
config.NewConfig<BlogPostDto, BlogPost>()
.Ignore(dest => dest.Content);
}
}
// Registration in Program.cs
TypeAdapterConfig.GlobalSettings.Scan(Assembly.GetExecutingAssembly());
// Usage with Dapper
public class BlogService
{
private readonly string _connectionString;
public async Task<List<BlogPostDto>> GetPostsAsync()
{
using var connection = new NpgsqlConnection(_connectionString);
var posts = await connection.QueryAsync<BlogPost>(@"
SELECT p.id, p.title, p.content
FROM blog_posts p
");
// Map to DTOs - very fast with Mapster
return posts.Adapt<List<BlogPostDto>>();
}
// Projection mapping (compile-time)
public async Task<List<BlogPostDto>> GetPostsDtosDirectlyAsync()
{
using var connection = new NpgsqlConnection(_connectionString);
// Query directly to DTO shape
return (await connection.QueryAsync<BlogPostDto>(@"
SELECT
id,
title,
SUBSTRING(content, 1, 200) as summary
FROM blog_posts
")).ToList();
}
}
AutoMapper is the most popular mapping library, though slower than Mapster.
// Install: AutoMapper and AutoMapper.Extensions.Microsoft.DependencyInjection
using AutoMapper;
public class MappingProfile : Profile
{
public MappingProfile()
{
CreateMap<BlogPost, BlogPostDto>()
.ForMember(d => d.Summary, opt => opt.MapFrom(s =>
s.Content.Length > 200 ? s.Content.Substring(0, 200) : s.Content))
.ForMember(d => d.CategoryNames, opt => opt.MapFrom(s =>
s.Categories.Select(c => c.Name)));
// Reverse map
CreateMap<BlogPostDto, BlogPost>()
.ForMember(d => d.Content, opt => opt.Ignore());
}
}
// Registration in Program.cs
services.AddAutoMapper(typeof(MappingProfile));
// Usage
public class BlogService
{
private readonly IMapper _mapper;
private readonly string _connectionString;
public BlogService(IMapper mapper, IConfiguration configuration)
{
_mapper = mapper;
_connectionString = configuration.GetConnectionString("DefaultConnection");
}
public async Task<List<BlogPostDto>> GetPostsAsync()
{
using var connection = new NpgsqlConnection(_connectionString);
var posts = await connection.QueryAsync<BlogPost>(@"
SELECT id, title, content FROM blog_posts
");
return _mapper.Map<List<BlogPostDto>>(posts.ToList());
}
}
Sometimes the best approach is explicit manual mapping:
public static class BlogPostMapper
{
public static BlogPostDto ToDto(this BlogPost post)
{
return new BlogPostDto
{
Id = post.Id,
Title = post.Title,
Summary = post.Content.Length > 200
? post.Content.Substring(0, 200) + "..."
: post.Content,
CategoryNames = post.Categories?.Select(c => c.Name).ToList() ?? new List<string>()
};
}
public static List<BlogPostDto> ToDtoList(this IEnumerable<BlogPost> posts)
{
return posts.Select(p => p.ToDto()).ToList();
}
// Inline mapping for simple cases
public static BlogPostDto MapToDto(BlogPost post) => new()
{
Id = post.Id,
Title = post.Title,
Summary = post.Content[..Math.Min(200, post.Content.Length)]
};
}
// Usage
var posts = await _repository.GetAllPostsAsync();
var dtos = posts.ToDtoList();
BenchmarkDotNet Results (mapping 1000 objects):
| Method         | Mean     | Allocated |
|----------------|----------|-----------|
| Manual Mapping | 45.2 μs  | 78 KB     |
| Mapster        | 52.1 μs  | 79 KB     |
| AutoMapper     | 184.3 μs | 156 KB    |
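For reference, here is a minimal sketch of how a mapping benchmark like this can be set up with BenchmarkDotNet. The numbers above will vary with hardware and data shape, and this harness is illustrative rather than the exact one used to produce them:

```csharp
using BenchmarkDotNet.Attributes;
using Mapster;

[MemoryDiagnoser]
public class MappingBenchmarks
{
    private List<BlogPost> _posts;

    [GlobalSetup]
    public void Setup()
    {
        // Register the Mapster config from earlier so the custom rules apply
        TypeAdapterConfig.GlobalSettings.Scan(typeof(MappingConfig).Assembly);

        _posts = Enumerable.Range(1, 1000)
            .Select(i => new BlogPost
            {
                Id = i,
                Title = $"Post {i}",
                Content = new string('x', 500),
                Categories = new List<Category>()
            })
            .ToList();
    }

    [Benchmark(Baseline = true)]
    public List<BlogPostDto> ManualMapping() => _posts.ToDtoList();

    [Benchmark]
    public List<BlogPostDto> MapsterMapping() => _posts.Adapt<List<BlogPostDto>>();
}

// Run with: BenchmarkRunner.Run<MappingBenchmarks>();
```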
Key Takeaways:
In real applications, you often want to use different data access approaches for different scenarios within the same codebase. For most production systems, this hybrid strategy is the recommended approach.
The CQRS (Command Query Responsibility Segregation) pattern is a natural fit for hybrid data access approaches. For a deeper dive into CQRS and event sourcing with Marten, see my article on Modern CQRS and Event Sourcing.
How Marten Relates to This Discussion:
Marten is a document database and event store built on PostgreSQL that takes hybrid data access to another level, combining document storage and event sourcing on top of PostgreSQL's JSONB support.
While this article focuses on traditional relational data access (EF Core, Dapper), Marten shows how you can leverage PostgreSQL's advanced features (JSONB, event streams) to implement sophisticated architectures. The principles are the same:
graph TB
Client[Client Application]
subgraph "Write Side - Commands"
WriteAPI[Write API / Commands]
EFCore[EF Core Context]
WriteDB[(PostgreSQL<br/>Write Operations)]
end
subgraph "Read Side - Queries"
ReadAPI[Read API / Queries]
Dapper[Dapper Repository]
ReadDB[(PostgreSQL<br/>Read Operations)]
end
Client -->|Create/Update/Delete| WriteAPI
WriteAPI --> EFCore
EFCore -->|Change Tracking<br/>Validation<br/>Business Logic| WriteDB
Client -->|Query/Search| ReadAPI
ReadAPI --> Dapper
Dapper -->|Optimized SQL<br/>DTOs<br/>No Tracking| ReadDB
WriteDB -.->|Same Database| ReadDB
style Client stroke:#6366f1,stroke-width:2px
style WriteAPI stroke:#2563eb,stroke-width:2px
style EFCore stroke:#2563eb,stroke-width:2px
style WriteDB stroke:#2563eb,stroke-width:2px
style ReadAPI stroke:#059669,stroke-width:2px
style Dapper stroke:#059669,stroke-width:2px
style ReadDB stroke:#059669,stroke-width:2px
This pattern leverages:
// Commands: Use EF Core for change tracking and validation
public class BlogCommandService
{
private readonly BlogDbContext _context;
private readonly ILogger<BlogCommandService> _logger;
public BlogCommandService(BlogDbContext context, ILogger<BlogCommandService> logger)
{
_context = context;
_logger = logger;
}
public async Task<int> CreatePostAsync(CreatePostCommand command)
{
// Business logic and validation
var post = new BlogPost
{
Title = command.Title,
Content = command.Content,
CategoryId = command.CategoryId,
PublishedDate = DateTime.UtcNow
};
_context.BlogPosts.Add(post);
await _context.SaveChangesAsync();
_logger.LogInformation("Created blog post {PostId}", post.Id);
return post.Id;
}
public async Task UpdatePostAsync(UpdatePostCommand command)
{
var post = await _context.BlogPosts.FindAsync(command.Id);
if (post == null)
throw new InvalidOperationException($"Post {command.Id} not found");
post.Title = command.Title;
post.Content = command.Content;
post.UpdatedAt = DateTime.UtcNow;
await _context.SaveChangesAsync();
_logger.LogInformation("Updated blog post {PostId}", post.Id);
}
public async Task DeletePostAsync(int id)
{
var post = await _context.BlogPosts.FindAsync(id);
if (post != null)
{
_context.BlogPosts.Remove(post);
await _context.SaveChangesAsync();
_logger.LogInformation("Deleted blog post {PostId}", id);
}
}
}
// Queries: Use Dapper for read performance
public class BlogQueryService
{
private readonly string _connectionString;
private readonly ILogger<BlogQueryService> _logger;
public BlogQueryService(IConfiguration configuration, ILogger<BlogQueryService> logger)
{
_connectionString = configuration.GetConnectionString("DefaultConnection");
_logger = logger;
}
public async Task<BlogPostDto> GetPostBySlugAsync(string slug)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
SELECT
p.id,
p.title,
p.slug,
p.content,
p.published_date,
c.id as category_id,
c.name as category_name,
(SELECT COUNT(*) FROM comments WHERE blog_post_id = p.id) as comment_count
FROM blog_posts p
INNER JOIN categories c ON p.category_id = c.id
WHERE p.slug = @Slug";
var post = await connection.QueryFirstOrDefaultAsync<BlogPostDto>(sql, new { Slug = slug });
if (post != null)
{
_logger.LogInformation("Retrieved blog post by slug {Slug}", slug);
}
return post;
}
public async Task<PagedResult<BlogPostSummaryDto>> GetRecentPostsAsync(int page, int pageSize)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
SELECT
p.id,
p.title,
p.slug,
LEFT(p.content, 200) as summary,
p.published_date,
c.name as category_name
FROM blog_posts p
INNER JOIN categories c ON p.category_id = c.id
ORDER BY p.published_date DESC
LIMIT @PageSize OFFSET @Offset";
const string countSql = "SELECT COUNT(*) FROM blog_posts";
var posts = await connection.QueryAsync<BlogPostSummaryDto>(
sql,
new { PageSize = pageSize, Offset = (page - 1) * pageSize }
);
var totalCount = await connection.ExecuteScalarAsync<int>(countSql);
return new PagedResult<BlogPostSummaryDto>
{
Items = posts.ToList(),
TotalCount = totalCount,
Page = page,
PageSize = pageSize
};
}
public async Task<List<BlogPostDto>> SearchPostsAsync(string searchTerm)
{
using var connection = new NpgsqlConnection(_connectionString);
const string sql = @"
SELECT
p.id,
p.title,
p.slug,
p.content,
p.published_date,
c.name as category_name,
ts_rank(p.search_vector, query) as relevance_score
FROM blog_posts p
INNER JOIN categories c ON p.category_id = c.id,
to_tsquery('english', @SearchTerm) query
WHERE p.search_vector @@ query
ORDER BY relevance_score DESC
LIMIT 50";
var posts = await connection.QueryAsync<BlogPostDto>(sql, new { SearchTerm = searchTerm });
_logger.LogInformation(
"Searched posts with term {SearchTerm}, found {Count} results",
searchTerm,
posts.Count()
);
return posts.ToList();
}
}
// Service layer orchestrating commands and queries
public class BlogService
{
private readonly BlogCommandService _commands;
private readonly BlogQueryService _queries;
public BlogService(BlogCommandService commands, BlogQueryService queries)
{
_commands = commands;
_queries = queries;
}
// Write operations delegate to command service
public Task<int> CreatePostAsync(CreatePostCommand command) => _commands.CreatePostAsync(command);
public Task UpdatePostAsync(UpdatePostCommand command) => _commands.UpdatePostAsync(command);
public Task DeletePostAsync(int id) => _commands.DeletePostAsync(id);
// Read operations delegate to query service
public Task<BlogPostDto> GetPostBySlugAsync(string slug) => _queries.GetPostBySlugAsync(slug);
public Task<PagedResult<BlogPostSummaryDto>> GetRecentPostsAsync(int page, int pageSize)
=> _queries.GetRecentPostsAsync(page, pageSize);
public Task<List<BlogPostDto>> SearchPostsAsync(string searchTerm)
=> _queries.SearchPostsAsync(searchTerm);
}
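Wiring the hybrid setup together is mostly a matter of dependency injection registration. A minimal sketch for Program.cs, assuming the same "DefaultConnection" connection string name used above:

```csharp
var builder = WebApplication.CreateBuilder(args);

// EF Core context for the command (write) side
builder.Services.AddDbContext<BlogDbContext>(options =>
    options.UseNpgsql(builder.Configuration.GetConnectionString("DefaultConnection")));

// Command, query and facade services
builder.Services.AddScoped<BlogCommandService>();
builder.Services.AddScoped<BlogQueryService>();
builder.Services.AddScoped<BlogService>();

var app = builder.Build();
```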
For applications that are primarily EF Core but need occasional performance optimization:
public class BlogService
{
private readonly BlogDbContext _context;
// 95% of queries: Use EF Core LINQ
public async Task<List<BlogPost>> GetPostsByCategoryAsync(int categoryId)
{
return await _context.BlogPosts
.Where(p => p.CategoryId == categoryId)
.Include(p => p.Comments)
.ToListAsync();
}
// 5% of queries: Use raw SQL for complex analytics
public async Task<List<PostAnalytics>> GetPostAnalyticsAsync()
{
// Note: the DbContext owns this connection, so don't wrap it in a using here
var connection = _context.Database.GetDbConnection();
await _context.Database.OpenConnectionAsync();
using var command = connection.CreateCommand();
command.CommandText = @"
WITH post_metrics AS (
SELECT
p.id,
p.title,
COUNT(DISTINCT c.id) as comment_count,
COUNT(DISTINCT v.id) as view_count,
AVG(c.sentiment_score) as avg_sentiment
FROM blog_posts p
LEFT JOIN comments c ON p.id = c.post_id
LEFT JOIN post_views v ON p.id = v.post_id
WHERE p.published_date >= NOW() - INTERVAL '30 days'
GROUP BY p.id, p.title
)
SELECT * FROM post_metrics
ORDER BY view_count DESC";
var analytics = new List<PostAnalytics>();
using var reader = await command.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
analytics.Add(new PostAnalytics
{
PostId = reader.GetInt32(0),
Title = reader.GetString(1),
CommentCount = reader.GetInt64(2),
ViewCount = reader.GetInt64(3),
AverageSentiment = reader.IsDBNull(4) ? 0 : reader.GetDouble(4)
});
}
return analytics;
}
}
Let's look at real-world performance benchmarks for common operations with PostgreSQL:
BenchmarkDotNet Results (Lower is Better):
Reading rows (simple query):

| Method                  | Mean     | Allocated |
|-------------------------|----------|-----------|
| EF Core (No Tracking)   | 12.34 ms | 2.4 MB    |
| EF Core (With Tracking) | 15.67 ms | 4.8 MB    |
| Dapper                  | 8.21 ms  | 1.8 MB    |
| Raw Npgsql              | 7.45 ms  | 1.2 MB    |

Bulk inserts:

| Method                | Mean     | Allocated |
|-----------------------|----------|-----------|
| EF Core (SaveChanges) | 245.3 ms | 15.2 MB   |
| EF Core (BulkInsert)  | 42.1 ms  | 8.4 MB    |
| Dapper (Loop)         | 189.7 ms | 2.1 MB    |
| Npgsql COPY           | 18.3 ms  | 0.8 MB    |

Loading related data (joins):

| Method                | Mean    | Allocated |
|-----------------------|---------|-----------|
| EF Core (Include)     | 28.5 ms | 5.2 MB    |
| EF Core (Split Query) | 24.1 ms | 4.8 MB    |
| Dapper (Multi-Map)    | 16.8 ms | 3.1 MB    |
| Raw Npgsql            | 15.2 ms | 2.4 MB    |
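As with the mapping numbers, treat these figures as indicative rather than absolute; they depend on hardware, row width, and PostgreSQL configuration. A minimal sketch of how a query comparison like this could be measured, reusing the repository types defined earlier (the connection string and the NpgsqlBlogRepository constructor are assumptions for illustration):

```csharp
using BenchmarkDotNet.Attributes;

[MemoryDiagnoser]
public class QueryBenchmarks
{
    private DapperBlogRepository _dapper;
    private NpgsqlBlogRepository _raw;

    [GlobalSetup]
    public void Setup()
    {
        const string connectionString =
            "Host=localhost;Database=blog;Username=postgres;Password=postgres";
        _dapper = new DapperBlogRepository(connectionString);
        _raw = new NpgsqlBlogRepository(connectionString); // assumes a matching constructor
    }

    [Benchmark]
    public async Task<int> DapperQuery() => (await _dapper.GetRecentPostsAsync(1000)).Count();

    [Benchmark]
    public async Task<int> RawNpgsqlQuery() => (await _raw.GetRecentPostsAsync(1000)).Count;
}
```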
See Part 1 for comprehensive EF Core guidance.
Choosing the right data access approach for your .NET application with PostgreSQL isn't about finding the "best" tool - it's about matching the right tool to your specific needs:
In practice, the most successful applications often use a hybrid approach, leveraging the strengths of each tool where appropriate:
The key is to:
Remember: premature optimization is the root of all evil, but so is building a system that can't scale when needed. Start simple, measure performance, and optimize where it matters.
Part 1 of This Series:
Official Documentation:
Related Articles on This Blog:
That concludes our two-part series on data access in .NET! We've covered everything from EF Core's powerful abstractions to raw SQL's maximum performance, with practical guidance on combining approaches for optimal results.