diff --git a/Readme.md b/Readme.md index 4a15274..55c737f 100644 --- a/Readme.md +++ b/Readme.md @@ -1,21 +1,46 @@ -# Pandatech.EFCore.PostgresExtensions -Pandatech.EFCore.PostgresExtensions is a NuGet package that enhances Entity Framework Core with support for PostgreSQL-specific syntax for update operations. +- [1. Pandatech.EFCore.PostgresExtensions](#1-pandatechefcorepostgresextensions) + - [1.1. Features](#11-features) + - [1.2. Installation](#12-installation) + - [1.3. Usage](#13-usage) + - [1.3.1. Row-Level Locking](#131-row-level-locking) + - [1.3.2. Npgsql COPY Integration](#132-npgsql-copy-integration) + - [1.3.2.1. Benchmarks](#1321-benchmarks) + - [1.3.2.1.1. General Benchmark Results](#13211-general-benchmark-results) + - [1.3.2.1.2. Detailed Benchmark Results](#13212-detailed-benchmark-results) + - [1.3.2.1.3. Efficiency Comparison](#13213-efficiency-comparison) + - [1.3.2.1.4. Additional Notes](#13214-additional-notes) + - [1.4. License](#14-license) -## Introduction -You can install the Pandatech.EFCore.PostgresExtensions NuGet package via the NuGet Package Manager UI or the Package Manager Console using the following command: +# 1. Pandatech.EFCore.PostgresExtensions + +Pandatech.EFCore.PostgresExtensions is an advanced NuGet package designed to enhance PostgreSQL functionality within +Entity Framework Core, leveraging specific features not covered by the official Npgsql.EntityFrameworkCore.PostgreSQL +package. This package introduces optimized row-level locking mechanisms and an efficient, typed version of the +PostgreSQL COPY operation, adhering to EF Core syntax for seamless integration into your projects. + +## 1.1. Features + +1. **Row-Level Locking**: Implements the PostgreSQL `FOR UPDATE` feature, providing three lock + behaviors - `Wait`, `SkipLocked`, and + `NoWait` - to facilitate advanced transaction control and concurrency management. +2. 
**Npgsql COPY Integration**: Offers a high-performance, typed interface for the PostgreSQL COPY command, allowing for + bulk data operations within the EF Core framework. This feature significantly enhances data insertion speeds and + efficiency. + +## 1.2. Installation + +To install Pandatech.EFCore.PostgresExtensions, use the following NuGet command: + +```bash Install-Package Pandatech.EFCore.PostgresExtensions +``` + +## 1.3. Usage -## Features -Adds support for PostgreSQL-specific update syntax. -Simplifies handling of update operations when working with PostgreSQL databases. +### 1.3.1. Row-Level Locking -## Installation -1. Install Pandatech.EFCore.PostgresExtensions Package -```Install-Package Pandatech.EFCore.PostgresExtensions``` - -2. Enable Query Locks +Configure your DbContext to use Npgsql and enable query locks: -Inside the AddDbContext or AddDbContextPool method, after calling UseNpgsql(), call the UseQueryLocks() method on the DbContextOptionsBuilder to enable query locks. ```csharp services.AddDbContext(options => { @@ -24,36 +49,82 @@ services.AddDbContext(options => }); ``` -## Usage -Use the provided ForUpdate extension method on IQueryable within your application to apply PostgreSQL-specific update syntax. 
+Within a transaction scope, apply the desired lock behavior using the `ForUpdate` extension method: + ```csharp -using Pandatech.EFCore.PostgresExtensions; -using Microsoft.EntityFrameworkCore; +using var transaction = _dbContext.Database.BeginTransaction(); +try +{ + var entityToUpdate = _dbContext.Entities + .Where(e => e.Id == id) + .ForUpdate(LockBehavior.NoWait) // Or LockBehavior.Default (Wait) / LockBehavior.SkipLocked + .FirstOrDefault(); -// Inside your service or repository method -using (var transaction = _dbContext.Database.BeginTransaction()) + // Perform updates on entityToUpdate + await _dbContext.SaveChangesAsync(); + transaction.Commit(); +} +catch (Exception ex) { - try - { - // Use the ForUpdate extension method on IQueryable inside the transaction scope - var entityToUpdate = _dbContext.Entities - .Where(e => e.Id == id) - .ForUpdate() - .FirstOrDefault(); + transaction.Rollback(); + // Handle exception +} +``` - // Perform updates on entityToUpdate +### 1.3.2. Npgsql COPY Integration - await _dbContext.SaveChangesAsync(); +For bulk data operations, use the `BulkInsert` or `BulkInsertAsync` extension methods: - transaction.Commit(); - } - catch (Exception ex) +```csharp +public async Task BulkInsertExampleAsync() +{ + var users = new List<UserEntity>(); + for (int i = 0; i < 10000; i++) { - transaction.Rollback(); - // Handle exception + users.Add(new UserEntity { /* Initialization */ }); } + + await dbContext.Users.BulkInsertAsync(users); // Or use BulkInsert for synchronous operation + // The data is persisted immediately; no separate SaveChanges call is needed } ``` -## License + +#### 1.3.2.1. Benchmarks + +The integration of the Npgsql COPY command showcases significant performance improvements compared to traditional EF +Core and Dapper methods: + +##### 1.3.2.1.1. 
General Benchmark Results + +| Method | Big O Notation | 1M Rows | Batch Size | +|------------|----------------|-------------|------------| +| BulkInsert | O(log n) | 350,000 r/s | No batch | +| Dapper | O(n) | 20,000 r/s | 1500 | +| EFCore | O(n) | 10,600 r/s | 1500 | + +##### 1.3.2.1.2. Detailed Benchmark Results + +| Operation | BulkInsert | Dapper | EF Core | +|-------------|------------|--------|---------| +| Insert 10K | 76ms | 535ms | 884ms | +| Insert 100K | 405ms | 5.47s | 8.58s | +| Insert 1M | 2.87s | 55.85s | 94.57s | + +##### 1.3.2.1.3. Efficiency Comparison + +| RowsCount | BulkInsert Efficiency | Dapper Efficiency | +|-----------|----------------------------|---------------------------| +| 10K | 11.63x faster than EF Core | 1.65x faster than EF Core | +| 100K | 21.17x faster than EF Core | 1.57x faster than EF Core | +| 1M | 32.95x faster than EF Core | 1.69x faster than EF Core | + +##### 1.3.2.1.4. Additional Notes + +- The `BulkInsert` feature currently does not support entity properties intended for `JSON` storage. + +- The performance metrics provided above are based on benchmarks conducted under controlled conditions. Real-world + performance may vary based on specific use cases and configurations. + +## 1.4. License Pandatech.EFCore.PostgresExtensions is licensed under the MIT License. 
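The benchmark timings and generated SQL quoted above are surfaced through the static `BulkInsertExtension.Logger` property added in this PR (see `BulkInsertExtensionSync.cs` below). A minimal sketch of wiring it to console logging; the logger category name is illustrative:

```csharp
using EFCore.PostgresExtensions.Extensions.BulkInsertExtension;
using Microsoft.Extensions.Logging;

// Route the package's diagnostics (generated COPY SQL, column/row
// counts, elapsed milliseconds) into an application logger.
using var loggerFactory = LoggerFactory.Create(builder => builder.AddConsole());
BulkInsertExtension.Logger = loggerFactory.CreateLogger("BulkInsert");
```

Leaving `Logger` unset simply disables this diagnostic output, since every log call is guarded with `Logger?.`.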
diff --git a/img.png b/img.png new file mode 100644 index 0000000..c126bd8 Binary files /dev/null and b/img.png differ diff --git a/src/EFCore.PostgresExtensions/EFCore.PostgresExtensions.csproj b/src/EFCore.PostgresExtensions/EFCore.PostgresExtensions.csproj index 71463c9..19da96c 100644 --- a/src/EFCore.PostgresExtensions/EFCore.PostgresExtensions.csproj +++ b/src/EFCore.PostgresExtensions/EFCore.PostgresExtensions.csproj @@ -8,22 +8,22 @@ Readme.md Pandatech MIT - 1.0.0 + 2.0.0 Pandatech.EFCore.PostgresExtensions Pandatech.EFCore.PostgresExtensions - Pandatech, library, EntityFrameworkCore, PostgreSQL, For Update, Lock, LockingSyntax - The Pandatech.EFCore.PostgresExtensions library enriches Entity Framework Core applications with advanced PostgreSQL functionalities, starting with the ForUpdate locking syntax. Designed for seamless integration, this NuGet package aims to enhance the efficiency and capabilities of EF Core models when working with PostgreSQL, with the potential for further PostgreSQL-specific extensions. + Pandatech, library, EntityFrameworkCore, PostgreSQL, For Update, Lock, LockingSyntax, Bulk insert, BinaryCopy + The Pandatech.EFCore.PostgresExtensions library enriches Entity Framework Core applications with advanced PostgreSQL functionalities, starting with the ForUpdate locking syntax and BulkInsert function. Designed for seamless integration, this NuGet package aims to enhance the efficiency and capabilities of EF Core models when working with PostgreSQL, with the potential for further PostgreSQL-specific extensions. 
https://github.com/PandaTechAM/be-lib-efcore-postgres-extensions - InitialCommit + Npgsql copy feature - - + + - + diff --git a/src/EFCore.PostgresExtensions/Extensions/BulkInsertExtension/BulkInsertExtensionSync.cs b/src/EFCore.PostgresExtensions/Extensions/BulkInsertExtension/BulkInsertExtensionSync.cs new file mode 100644 index 0000000..849a59b --- /dev/null +++ b/src/EFCore.PostgresExtensions/Extensions/BulkInsertExtension/BulkInsertExtensionSync.cs @@ -0,0 +1,159 @@ +using System.Collections; +using System.Diagnostics; +using System.Reflection; +using Microsoft.EntityFrameworkCore; +using Microsoft.EntityFrameworkCore.Metadata; +using Microsoft.Extensions.Logging; +using Npgsql; +using Npgsql.EntityFrameworkCore.PostgreSQL.Storage.Internal.Mapping; + +namespace EFCore.PostgresExtensions.Extensions.BulkInsertExtension; + +public static class BulkInsertExtension +{ + public static ILogger? Logger { get; set; } + + public static async Task BulkInsertAsync<T>(this DbSet<T> dbSet, List<T> entities, + bool pkGeneratedByDb = true) where T : class + { + var context = PrepareBulkInsertOperation(dbSet, entities, pkGeneratedByDb, out var sp, out var properties, + out var columnCount, out var sql, out var propertyInfos, out var propertyTypes); + + var connection = new NpgsqlConnection(context.Database.GetConnectionString()); + await connection.OpenAsync(); + + await using var writer = await connection.BeginBinaryImportAsync(sql); + + for (var entity = 0; entity < entities.Count; entity++) + { + var item = entities[entity]; + var values = propertyInfos.Select(property => property!.GetValue(item)).ToList(); + + ConvertEnumValue<T>(columnCount, propertyTypes, properties, values); + + await writer.StartRowAsync(); + + for (var i = 0; i < columnCount; i++) + { + await writer.WriteAsync(values[i]); + } + } + + await writer.CompleteAsync(); + await connection.CloseAsync(); + sp.Stop(); + + Logger?.LogInformation("Binary copy completed successfully. 
Total time: {Milliseconds} ms", + sp.ElapsedMilliseconds); + } + + public static void BulkInsert<T>(this DbSet<T> dbSet, List<T> entities, + bool pkGeneratedByDb = true) where T : class + { + var context = PrepareBulkInsertOperation(dbSet, entities, pkGeneratedByDb, out var sp, out var properties, + out var columnCount, out var sql, out var propertyInfos, out var propertyTypes); + + var connection = new NpgsqlConnection(context.Database.GetConnectionString()); + connection.Open(); + + using var writer = connection.BeginBinaryImport(sql); + + for (var entity = 0; entity < entities.Count; entity++) + { + var item = entities[entity]; + var values = propertyInfos.Select(property => property!.GetValue(item)).ToList(); + + ConvertEnumValue<T>(columnCount, propertyTypes, properties, values); + + writer.StartRow(); + + for (var i = 0; i < columnCount; i++) + { + writer.Write(values[i]); + } + } + + writer.Complete(); + connection.Close(); + sp.Stop(); + + Logger?.LogInformation("Binary copy completed successfully. 
Total time: {Milliseconds} ms", + sp.ElapsedMilliseconds); + } + + private static void ConvertEnumValue<T>(int columnCount, IReadOnlyList<Type> propertyTypes, + IReadOnlyList<IProperty> properties, IList<object?> values) where T : class + { + for (var i = 0; i < columnCount; i++) + { + if (propertyTypes[i].IsEnum) + { + values[i] = Convert.ChangeType(values[i], Enum.GetUnderlyingType(propertyTypes[i])); + continue; + } + + // Check for generic types, specifically lists, and ensure the generic type is an enum + if (!propertyTypes[i].IsGenericType || propertyTypes[i].GetGenericTypeDefinition() != typeof(List<>) || + !propertyTypes[i].GetGenericArguments()[0].IsEnum) continue; + + var enumMapping = properties[i].FindTypeMapping(); + + // Only proceed if the mapping is for an array type, as expected for lists + if (enumMapping is not NpgsqlArrayTypeMapping) continue; + + var list = (IList)values[i]!; + var underlyingType = Enum.GetUnderlyingType(propertyTypes[i].GetGenericArguments()[0]); + + var convertedList = (from object item in list select Convert.ChangeType(item, underlyingType)).ToList(); + values[i] = convertedList; + } + } + + + private static DbContext PrepareBulkInsertOperation<T>(DbSet<T> dbSet, List<T> entities, bool pkGeneratedByDb, + out Stopwatch sp, out List<IProperty> properties, out int columnCount, out string sql, + out List<PropertyInfo?> propertyInfos, out List<Type> propertyTypes) where T : class + { + sp = Stopwatch.StartNew(); + var context = dbSet.GetDbContext(); + + + if (entities == null || entities.Count == 0) + throw new ArgumentException("The model list cannot be null or empty."); + + if (context == null) throw new ArgumentNullException(nameof(context), "The DbContext instance cannot be null."); + + + var entityType = context.Model.FindEntityType(typeof(T)) ?? + throw new InvalidOperationException("Entity type not found."); + + var tableName = entityType.GetTableName() ?? 
+ throw new InvalidOperationException("Table name is null or empty."); + + properties = entityType.GetProperties().ToList(); + + if (pkGeneratedByDb) + properties = properties.Where(x => !x.IsKey()).ToList(); + + var columnNames = properties.Select(x => $"\"{x.GetColumnName()}\"").ToList(); + + if (columnNames.Count == 0) + throw new InvalidOperationException("Column names are null or empty."); + + + columnCount = columnNames.Count; + var rowCount = entities.Count; + + Logger?.LogDebug( + "Column names found successfully. \n Total column count: {ColumnCount} \n Total row count: {RowCount}", + columnCount, rowCount); + + sql = $"COPY \"{tableName}\" ({string.Join(", ", columnNames)}) FROM STDIN (FORMAT BINARY)"; + + Logger?.LogInformation("SQL query created successfully. Sql query: {Sql}", sql); + + propertyInfos = properties.Select(x => x.PropertyInfo).ToList(); + propertyTypes = propertyInfos.Select(x => x!.PropertyType).ToList(); + return context; + } +} \ No newline at end of file diff --git a/src/EFCore.PostgresExtensions/Extensions/DbSetExtensions.cs b/src/EFCore.PostgresExtensions/Extensions/DbSetExtensions.cs new file mode 100644 index 0000000..ecd9398 --- /dev/null +++ b/src/EFCore.PostgresExtensions/Extensions/DbSetExtensions.cs @@ -0,0 +1,15 @@ +using Microsoft.EntityFrameworkCore; +using Microsoft.EntityFrameworkCore.Infrastructure; + +namespace EFCore.PostgresExtensions.Extensions; + +public static class DbSetExtensions +{ + public static DbContext GetDbContext<T>(this DbSet<T> dbSet) where T : class + { + var infrastructure = dbSet as IInfrastructure<IServiceProvider>; + var serviceProvider = infrastructure!.Instance; + var currentDbContext = serviceProvider.GetService(typeof(ICurrentDbContext)) as ICurrentDbContext; + return currentDbContext!.Context; + } +} \ No newline at end of file diff --git a/test/PandaNuGet.Demo/Context/DatabaseExtensions.cs b/test/PandaNuGet.Demo/Context/DatabaseExtensions.cs new file mode 100644 index 0000000..f64d89e --- /dev/null +++ 
b/test/PandaNuGet.Demo/Context/DatabaseExtensions.cs @@ -0,0 +1,33 @@ +using Microsoft.EntityFrameworkCore; + +namespace PandaNuGet.Demo.Context; + +public static class DatabaseExtensions +{ + public static WebApplicationBuilder AddPostgresContext(this WebApplicationBuilder builder) + { + var configuration = builder.Configuration; + + var connectionString = configuration.GetConnectionString("Postgres"); + builder.Services.AddDbContextPool<PostgresContext>(options => + options.UseNpgsql(connectionString)); + return builder; + } + + public static WebApplication ResetDatabase(this WebApplication app) + { + using var scope = app.Services.CreateScope(); + var dbContext = scope.ServiceProvider.GetRequiredService<PostgresContext>(); + dbContext.Database.EnsureDeleted(); + dbContext.Database.EnsureCreated(); + return app; + } + + public static WebApplication MigrateDatabase(this WebApplication app) + { + using var scope = app.Services.CreateScope(); + var dbContext = scope.ServiceProvider.GetRequiredService<PostgresContext>(); + dbContext.Database.Migrate(); + return app; + } +} \ No newline at end of file diff --git a/test/PandaNuGet.Demo/Context/PostgresContext.cs b/test/PandaNuGet.Demo/Context/PostgresContext.cs new file mode 100644 index 0000000..9bc1cf8 --- /dev/null +++ b/test/PandaNuGet.Demo/Context/PostgresContext.cs @@ -0,0 +1,9 @@ +using Microsoft.EntityFrameworkCore; +using PandaNuGet.Demo.Entities; + +namespace PandaNuGet.Demo.Context; + +public class PostgresContext(DbContextOptions<PostgresContext> options) : DbContext(options) +{ + public DbSet<UserEntity> Users { get; set; } = null!; +} \ No newline at end of file diff --git a/test/PandaNuGet.Demo/Dtos/BulkBenchmarkResponse.cs b/test/PandaNuGet.Demo/Dtos/BulkBenchmarkResponse.cs new file mode 100644 index 0000000..812b765 --- /dev/null +++ b/test/PandaNuGet.Demo/Dtos/BulkBenchmarkResponse.cs @@ -0,0 +1,13 @@ +using System.Text.Json.Serialization; + +namespace PandaNuGet.Demo.Dtos; + +public record BulkBenchmarkResponse(BenchmarkMethod Method, int RowsCount, string ElapsedMs); + 
+[JsonConverter(typeof(JsonStringEnumConverter))] +public enum BenchmarkMethod +{ + EFCore, + Dapper, + NpgsqlCopy +} \ No newline at end of file diff --git a/test/PandaNuGet.Demo/Entities/UserEntity.cs b/test/PandaNuGet.Demo/Entities/UserEntity.cs new file mode 100644 index 0000000..01c8d6d --- /dev/null +++ b/test/PandaNuGet.Demo/Entities/UserEntity.cs @@ -0,0 +1,28 @@ +using Microsoft.EntityFrameworkCore; + +namespace PandaNuGet.Demo.Entities; + +[PrimaryKey(nameof(Id))] +public class UserEntity +{ + public int Id { get; set; } + public Guid AlternateId { get; set; } = Guid.NewGuid(); + public string Name { get; set; } = "John Wick"; + public string? Address { get; set; } + public decimal Height { get; set; } = 1.85m; + public decimal? Weight { get; set; } + public DateTime BirthDate { get; set; } = DateTime.UtcNow; + public DateTime? DeathDate { get; set; } + public Status Status { get; set; } = Status.Active; + public bool IsMarried { get; set; } = true; + public bool? IsHappy { get; set; } + public string Description { get; set; } = "Some description to load the field with some data."; + public byte[] Image { get; set; } = [1, 2, 3, 4, 5]; + public byte[]? 
Document { get; set; } +} + +public enum Status +{ + Active, + Inactive +} \ No newline at end of file diff --git a/test/PandaNuGet.Demo/PandaNuGet.Demo.csproj b/test/PandaNuGet.Demo/PandaNuGet.Demo.csproj index f486c2f..a0dbbc1 100644 --- a/test/PandaNuGet.Demo/PandaNuGet.Demo.csproj +++ b/test/PandaNuGet.Demo/PandaNuGet.Demo.csproj @@ -8,7 +8,8 @@ - + + diff --git a/test/PandaNuGet.Demo/Program.cs b/test/PandaNuGet.Demo/Program.cs index c0f6005..417c4b3 100644 --- a/test/PandaNuGet.Demo/Program.cs +++ b/test/PandaNuGet.Demo/Program.cs @@ -1,14 +1,85 @@ +using PandaNuGet.Demo.Context; +using PandaNuGet.Demo.Dtos; +using PandaNuGet.Demo.Services; + var builder = WebApplication.CreateBuilder(args); +builder.AddPostgresContext(); +builder.Services.AddScoped<BulkInsertService>(); builder.Services.AddEndpointsApiExplorer(); builder.Services.AddSwaggerGen(); -var app = builder.Build(); +var app = builder.Build(); +app.ResetDatabase(); app.UseSwagger(); app.UseSwaggerUI(); +app.MapGet("ping", () => "pong"); + +app.MapGet("/benchmark-sync/{minimumRows:int}", (BulkInsertService service, int minimumRows) => +{ + var results = new List<BulkBenchmarkResponse> + { + service.BulkInsertEfCore(minimumRows), + service.BulkInsertNpgsqlCopy(minimumRows), + service.BulkInsertDapper(minimumRows), + service.BulkInsertEfCore(minimumRows * 10), + service.BulkInsertDapper(minimumRows * 10), + service.BulkInsertNpgsqlCopy(minimumRows * 10), + service.BulkInsertEfCore(minimumRows * 100), + service.BulkInsertDapper(minimumRows * 100), + service.BulkInsertNpgsqlCopy(minimumRows * 100) + }; + + return results; +}); + +app.MapGet("/benchmark-async/{minimumRows:int}", async (BulkInsertService service, int minimumRows) => +{ + var results = new List<BulkBenchmarkResponse> + { + await service.BulkInsertEfCoreAsync(minimumRows), + await service.BulkInsertDapperAsync(minimumRows), + await service.BulkInsertNpgsqlCopyAsync(minimumRows), + await service.BulkInsertEfCoreAsync(minimumRows * 10), + await service.BulkInsertDapperAsync(minimumRows * 10), + await 
service.BulkInsertNpgsqlCopyAsync(minimumRows * 10), + await service.BulkInsertEfCoreAsync(minimumRows * 100), + await service.BulkInsertDapperAsync(minimumRows * 100), + await service.BulkInsertNpgsqlCopyAsync(minimumRows * 100) + }; + + return results; +}); + + +app.MapGet("/concurrency1", async (BulkInsertService service) => +{ + await service.BulkInsertEfCoreAsync(100000); +}); +app.MapGet("/concurrency4", async (BulkInsertService service) => +{ + await service.BulkInsertEfCoreAsync(100000, true); +}); +app.MapGet("/concurrency2", async (BulkInsertService service) => +{ + await service.BulkInsertDapperAsync(200000, true); +}); +app.MapGet("/concurrency3", async (BulkInsertService service) => +{ + await service.BulkInsertDapperAsync(200000, true); +}); +app.MapGet("/concurrency5", async (BulkInsertService service) => +{ + await service.BulkInsertNpgsqlCopyAsync(5_000_000, true); +}); +app.MapGet("/concurrency6", async (BulkInsertService service) => +{ + await service.BulkInsertNpgsqlCopyAsync(5_000_000, true); +}); + app.Run(); \ No newline at end of file diff --git a/test/PandaNuGet.Demo/Services/BulkInsertService.cs b/test/PandaNuGet.Demo/Services/BulkInsertService.cs new file mode 100644 index 0000000..7bdf3c3 --- /dev/null +++ b/test/PandaNuGet.Demo/Services/BulkInsertService.cs @@ -0,0 +1,232 @@ +using System.Diagnostics; +using System.Text; +using Dapper; +using EFCore.PostgresExtensions.Extensions.BulkInsertExtension; +using Microsoft.EntityFrameworkCore; +using PandaNuGet.Demo.Context; +using PandaNuGet.Demo.Dtos; +using PandaNuGet.Demo.Entities; + +namespace PandaNuGet.Demo.Services; + +public class BulkInsertService(PostgresContext dbContext) +{ + private const int BatchSize = 1500; + + public async Task<BulkBenchmarkResponse> BulkInsertEfCoreAsync(int rowsCount, bool ignoreReset = false) + { + await ResetDbAsync(ignoreReset); + List<UserEntity> users = new(); + + for (int i = 0; i < rowsCount; i++) + { + users.Add(new UserEntity()); + } + + var stopwatch = Stopwatch.StartNew(); + 
dbContext.ChangeTracker.AutoDetectChangesEnabled = false; + + for (int i = 0; i < users.Count; i += BatchSize) + { + var batch = users.Skip(i).Take(BatchSize).ToList(); + await dbContext.Users.AddRangeAsync(batch); + await dbContext.SaveChangesAsync(); + } + + dbContext.ChangeTracker.Clear(); + stopwatch.Stop(); + + return new BulkBenchmarkResponse(BenchmarkMethod.EFCore, rowsCount, + stopwatch.ElapsedMilliseconds.ToString()); + } + + public BulkBenchmarkResponse BulkInsertEfCore(int rowsCount, bool ignoreReset = false) + { + ResetDb(ignoreReset); + + List<UserEntity> users = new(); + + for (int i = 0; i < rowsCount; i++) + { + users.Add(new UserEntity()); + } + + var stopwatch = Stopwatch.StartNew(); + dbContext.ChangeTracker.AutoDetectChangesEnabled = false; + + for (int i = 0; i < users.Count; i += BatchSize) + { + var batch = users.Skip(i).Take(BatchSize).ToList(); + dbContext.Users.AddRange(batch); + dbContext.SaveChanges(); + } + + dbContext.ChangeTracker.Clear(); + stopwatch.Stop(); + + return new BulkBenchmarkResponse(BenchmarkMethod.EFCore, rowsCount, + stopwatch.ElapsedMilliseconds.ToString()); + } + + public async Task<BulkBenchmarkResponse> BulkInsertNpgsqlCopyAsync(int rowsCount, bool ignoreReset = false) + { + await ResetDbAsync(ignoreReset); + var users = new List<UserEntity>(); + + for (int i = 0; i < rowsCount; i++) + { + users.Add(new UserEntity()); + } + + var stopwatch = Stopwatch.StartNew(); + await dbContext.Users.BulkInsertAsync(users); + stopwatch.Stop(); + + return new BulkBenchmarkResponse(BenchmarkMethod.NpgsqlCopy, rowsCount, + stopwatch.ElapsedMilliseconds.ToString()); + } + + public BulkBenchmarkResponse BulkInsertNpgsqlCopy(int rowsCount, bool ignoreReset = false) + { + ResetDb(ignoreReset); + var users = new List<UserEntity>(); + + for (int i = 0; i < rowsCount; i++) + { + users.Add(new UserEntity()); + } + + var stopwatch = Stopwatch.StartNew(); + dbContext.Users.BulkInsert(users); + stopwatch.Stop(); + + return new BulkBenchmarkResponse(BenchmarkMethod.NpgsqlCopy, rowsCount, + 
stopwatch.ElapsedMilliseconds.ToString()); + } + + public BulkBenchmarkResponse BulkInsertDapper(int rowsCount, bool ignoreReset = false) + { + ResetDb(ignoreReset); + + var users = new List<UserEntity>(); + for (int i = 0; i < rowsCount; i++) + { + users.Add(new UserEntity()); + } + + var stopwatch = Stopwatch.StartNew(); + + for (int batchStart = 0; batchStart < users.Count; batchStart += BatchSize) + { + var batchUsers = users.Skip(batchStart).Take(BatchSize); + var queryBuilder = new StringBuilder( + "INSERT INTO \"Users\" (\"AlternateId\", \"Name\", \"Address\", \"Height\", \"Weight\", \"BirthDate\", \"DeathDate\", \"Status\", \"IsMarried\", \"IsHappy\", \"Description\", \"Image\", \"Document\") VALUES "); + var parameters = new DynamicParameters(); + + int index = 0; + foreach (var user in batchUsers) + { + queryBuilder.Append( + $"(@AlternateId{index}, @Name{index}, @Address{index}, @Height{index}, @Weight{index}, @BirthDate{index}, @DeathDate{index}, @Status{index}, @IsMarried{index}, @IsHappy{index}, @Description{index}, @Image{index}, @Document{index}),"); + + parameters.Add($"@AlternateId{index}", user.AlternateId); + parameters.Add($"@Name{index}", user.Name); + parameters.Add($"@Address{index}", user.Address); + parameters.Add($"@Height{index}", user.Height); + parameters.Add($"@Weight{index}", user.Weight); + parameters.Add($"@BirthDate{index}", user.BirthDate); + parameters.Add($"@DeathDate{index}", user.DeathDate); + parameters.Add($"@Status{index}", user.Status); + parameters.Add($"@IsMarried{index}", user.IsMarried); + parameters.Add($"@IsHappy{index}", user.IsHappy); + parameters.Add($"@Description{index}", user.Description); + parameters.Add($"@Image{index}", user.Image); + parameters.Add($"@Document{index}", user.Document); + index++; + } + + queryBuilder.Length--; // Remove the last comma + + using var transaction = dbContext.Database.BeginTransaction(); + dbContext.Database.GetDbConnection().Execute(queryBuilder.ToString(), parameters); + 
transaction.Commit(); + } + + stopwatch.Stop(); + return new BulkBenchmarkResponse(BenchmarkMethod.Dapper, rowsCount, stopwatch.ElapsedMilliseconds.ToString()); + } + + public async Task<BulkBenchmarkResponse> BulkInsertDapperAsync(int rowsCount, bool ignoreReset = false) + { + await ResetDbAsync(ignoreReset); + + var users = new List<UserEntity>(); + for (int i = 0; i < rowsCount; i++) + { + users.Add(new UserEntity()); + } + + var stopwatch = Stopwatch.StartNew(); + + for (int batchStart = 0; batchStart < users.Count; batchStart += BatchSize) + { + var batchUsers = users.Skip(batchStart).Take(BatchSize); + var queryBuilder = new StringBuilder( + "INSERT INTO \"Users\" (\"AlternateId\", \"Name\", \"Address\", \"Height\", \"Weight\", \"BirthDate\", \"DeathDate\", \"Status\", \"IsMarried\", \"IsHappy\", \"Description\", \"Image\", \"Document\") VALUES "); + var parameters = new DynamicParameters(); + + int index = 0; + foreach (var user in batchUsers) + { + queryBuilder.Append( + $"(@AlternateId{index}, @Name{index}, @Address{index}, @Height{index}, @Weight{index}, @BirthDate{index}, @DeathDate{index}, @Status{index}, @IsMarried{index}, @IsHappy{index}, @Description{index}, @Image{index}, @Document{index}),"); + + parameters.Add($"@AlternateId{index}", user.AlternateId); + parameters.Add($"@Name{index}", user.Name); + parameters.Add($"@Address{index}", user.Address); + parameters.Add($"@Height{index}", user.Height); + parameters.Add($"@Weight{index}", user.Weight); + parameters.Add($"@BirthDate{index}", user.BirthDate); + parameters.Add($"@DeathDate{index}", user.DeathDate); + parameters.Add($"@Status{index}", user.Status); + parameters.Add($"@IsMarried{index}", user.IsMarried); + parameters.Add($"@IsHappy{index}", user.IsHappy); + parameters.Add($"@Description{index}", user.Description); + parameters.Add($"@Image{index}", user.Image); + parameters.Add($"@Document{index}", user.Document); + index++; + } + + queryBuilder.Length--; // Remove the last comma + + await using var transaction = await 
dbContext.Database.BeginTransactionAsync(); + await dbContext.Database.GetDbConnection().ExecuteAsync(queryBuilder.ToString(), parameters); + await transaction.CommitAsync(); + } + + stopwatch.Stop(); + return new BulkBenchmarkResponse(BenchmarkMethod.Dapper, rowsCount, stopwatch.ElapsedMilliseconds.ToString()); + } + + private async Task ResetDbAsync(bool ignore) + { + if (ignore) + { + return; + } + + await dbContext.Database.EnsureDeletedAsync(); + await dbContext.Database.EnsureCreatedAsync(); + } + + private void ResetDb(bool ignore) + { + if (ignore) + { + return; + } + + dbContext.Database.EnsureDeleted(); + dbContext.Database.EnsureCreated(); + } +} \ No newline at end of file diff --git a/test/PandaNuGet.Demo/appsettings.Development.json b/test/PandaNuGet.Demo/appsettings.Development.json deleted file mode 100644 index 0c208ae..0000000 --- a/test/PandaNuGet.Demo/appsettings.Development.json +++ /dev/null @@ -1,8 +0,0 @@ -{ - "Logging": { - "LogLevel": { - "Default": "Information", - "Microsoft.AspNetCore": "Warning" - } - } -} diff --git a/test/PandaNuGet.Demo/appsettings.json b/test/PandaNuGet.Demo/appsettings.json index 10f68b8..f2af0c2 100644 --- a/test/PandaNuGet.Demo/appsettings.json +++ b/test/PandaNuGet.Demo/appsettings.json @@ -1,9 +1,12 @@ { "Logging": { "LogLevel": { - "Default": "Information", - "Microsoft.AspNetCore": "Warning" + "Default": "Error", + "Microsoft.AspNetCore": "Error" } }, - "AllowedHosts": "*" + "AllowedHosts": "*", + "ConnectionStrings": { + "Postgres": "Server=localhost;Port=5432;Database=postgres_extensions;User Id=test;Password=test;Pooling=true;" + } } diff --git a/test/PandaNuGet.Tests/PandaNuGet.Tests.csproj b/test/PandaNuGet.Tests/PandaNuGet.Tests.csproj index 5a64565..299c1f9 100644 --- a/test/PandaNuGet.Tests/PandaNuGet.Tests.csproj +++ b/test/PandaNuGet.Tests/PandaNuGet.Tests.csproj @@ -11,12 +11,12 @@ - - + + runtime; build; native; contentfiles; analyzers; buildtransitive all - + runtime; build; native; 
contentfiles; analyzers; buildtransitive all
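One behavioral detail of the new API that the Readme does not spell out: `BulkInsert` and `BulkInsertAsync` take a `pkGeneratedByDb` flag (default `true`) that excludes key columns from the generated COPY column list. A hedged sketch of the two call shapes, using the demo's `Users` set; the entity values are illustrative:

```csharp
using EFCore.PostgresExtensions.Extensions.BulkInsertExtension;

// Default: primary keys are database-generated, so key columns are
// omitted from the COPY statement and assigned by PostgreSQL.
await dbContext.Users.BulkInsertAsync(users);

// Client-supplied keys: keep key columns in the COPY column list,
// so each entity's Id is written as-is.
await dbContext.Users.BulkInsertAsync(users, pkGeneratedByDb: false);
```

With `pkGeneratedByDb: false`, every entity must carry a unique key value up front, since COPY bypasses EF Core's value generation entirely.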