
feat(IZipArchiveService): add ArchiveAsync method #7373

Merged
ArgoZhang merged 7 commits into main from feat-zip on Dec 19, 2025
Conversation

ArgoZhang (Member) commented Dec 19, 2025

Link issues

fixes #7371

Summary by Copilot

Regression?

  • Yes
  • No

Risk

  • High
  • Medium
  • Low

Verification

  • Manual (required)
  • Automated

Packaging changes reviewed?

  • Yes
  • No
  • N/A

☑️ Self Check before Merge

⚠️ Please check all items below before review. ⚠️

  • Doc is updated/provided or not needed
  • Demo is updated/provided or not needed
  • Merge the latest code from the main branch

Summary by Sourcery

Introduce a more flexible zip archiving model based on archive entries and align the zip service API around asynchronous operations.

New Features:

  • Add an ArchiveEntry value type to describe archive items with source path, archive entry name, and optional compression level.
  • Support archiving using ArchiveEntry collections, allowing explicit control over entry names, directories, and per-entry compression settings.
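Putting these two features together, building an archive from `ArchiveEntry` items might look like the sketch below. The type and member names (`ArchiveEntry`, `SourceFileName`, `EntryName`, `CompressionLevel`, `ArchiveAsync`, `ArchiveOptions`) come from this PR's description; the `zipService` variable, the file paths, and the object-initializer syntax are illustrative assumptions, not code from the PR:

```csharp
// Hypothetical usage sketch; "zipService" is assumed to be an injected
// IZipArchiveService and the paths are placeholders.
var entries = new List<ArchiveEntry>
{
    // No per-entry level set: the ArchiveOptions default applies.
    new() { SourceFileName = "/tmp/report.txt", EntryName = "docs/report.txt" },

    // Per-entry override: store this already-compressed file as-is.
    new()
    {
        SourceFileName = "/tmp/logo.png",
        EntryName = "assets/logo.png",
        CompressionLevel = CompressionLevel.NoCompression
    }
};

await using var stream = await zipService.ArchiveAsync(entries, new ArchiveOptions
{
    CompressionLevel = CompressionLevel.Optimal
});
```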

Enhancements:

  • Rename and standardize zip service methods to async variants (e.g., ArchiveDirectoryAsync) and remove redundant overloads for a simpler API surface.
  • Update the default zip archive service implementation to handle both files and directories when building archives, honoring per-entry compression configuration.

Tests:

  • Adjust existing zip archive tests to use ArchiveEntry-based APIs and the async extraction method, and add new tests covering ZipArchive behavior and ArchiveAsync scenarios.

Copilot AI review requested due to automatic review settings December 19, 2025 08:24
@bb-auto bb-auto Bot added the enhancement New feature or request label Dec 19, 2025
@bb-auto bb-auto Bot added this to the v10.1.0 milestone Dec 19, 2025
sourcery-ai Bot (Contributor) commented Dec 19, 2025

Reviewer's Guide

Refactors the zip archive service API to work with a new ArchiveEntry value type, consolidates directory/archive methods around async operations, and updates tests to cover entry-based archiving, directory handling, and async extraction.

Sequence diagram for ArchiveAsync with ArchiveEntry handling

sequenceDiagram
    participant Caller
    participant IZipArchiveService
    participant DefaultZipArchiveService
    participant ArchiveOptions
    participant ZipArchive
    participant FileSystem

    Caller->>IZipArchiveService: ArchiveAsync(entries, options)
    IZipArchiveService->>DefaultZipArchiveService: ArchiveAsync(entries, options)
    DefaultZipArchiveService->>DefaultZipArchiveService: create MemoryStream
    alt options is null
        DefaultZipArchiveService->>ArchiveOptions: new ArchiveOptions()
        DefaultZipArchiveService->>ArchiveOptions: set LeaveOpen = true
    else options provided
        DefaultZipArchiveService->>ArchiveOptions: set LeaveOpen = true
    end
    DefaultZipArchiveService->>ZipArchive: new ZipArchive(stream, options.Mode, options.LeaveOpen, options.Encoding)
    loop for each entry in entries
        alt options.ReadStreamAsync not null
            DefaultZipArchiveService->>ZipArchive: CreateEntry(entry.EntryName, options.CompressionLevel)
            DefaultZipArchiveService->>ArchiveOptions: ReadStreamAsync(entry.SourceFileName)
            ArchiveOptions-->>DefaultZipArchiveService: content Stream
            DefaultZipArchiveService->>ZipArchive: entry.Open()
            DefaultZipArchiveService->>ZipArchive: CopyToAsync(content, entryStream)
        else Directory.Exists(entry.SourceFileName)
            DefaultZipArchiveService->>FileSystem: Directory.Exists(entry.SourceFileName)
            alt EntryName not empty
                DefaultZipArchiveService->>ZipArchive: CreateEntry(entry.EntryName as folder, entry.CompressionLevel or options.CompressionLevel)
            end
        else File.Exists(entry.SourceFileName)
            DefaultZipArchiveService->>FileSystem: File.Exists(entry.SourceFileName)
            DefaultZipArchiveService->>ZipArchive: CreateEntryFromFile(entry.SourceFileName, entry.EntryName, entry.CompressionLevel or options.CompressionLevel)
        end
    end
    DefaultZipArchiveService-->>Caller: Stream (position = 0)

Class diagram for updated zip archive service API

classDiagram
    class IZipArchiveService {
        <<interface>>
        +Task~Stream~ ArchiveAsync(IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
        +Task ArchiveAsync(string archiveFile, IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
        +Task ArchiveDirectoryAsync(string archiveFile, string directoryName, CompressionLevel compressionLevel, bool includeBaseDirectory, Encoding encoding, CancellationToken token)
        +Task~bool~ ExtractToDirectoryAsync(string archiveFile, string destinationDirectoryName, bool overwriteFiles, Encoding encoding, CancellationToken token)
        +ZipArchiveEntry GetEntry(string archiveFile, string entryFile, bool overwriteFiles, Encoding encoding)
    }

    class DefaultZipArchiveService {
        +Task~Stream~ ArchiveAsync(IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
        +Task ArchiveAsync(string archiveFile, IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
        +Task ArchiveDirectoryAsync(string archiveFile, string directoryName, CompressionLevel compressionLevel, bool includeBaseDirectory, Encoding encoding, CancellationToken token)
        +Task~bool~ ExtractToDirectoryAsync(string archiveFile, string destinationDirectoryName, bool overwriteFiles, Encoding encoding, CancellationToken token)
        +ZipArchiveEntry GetEntry(string archiveFile, string entryFile, bool overwriteFiles, Encoding encoding)
        -static Task ArchiveFilesAsync(Stream stream, IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
    }

    class ArchiveEntry {
        <<record struct>>
        +string SourceFileName
        +string EntryName
        +CompressionLevel CompressionLevel
    }

    class ArchiveOptions {
        +ZipArchiveMode Mode
        +bool LeaveOpen
        +Encoding Encoding
        +CompressionLevel CompressionLevel
        +Func~string, Task~Stream~~ ReadStreamAsync
    }

    class ZipArchive {
    }

    class ZipArchiveEntry {
    }

    class CompressionLevel {
    }

    class ZipArchiveMode {
    }

    class Encoding {
    }

    class CancellationToken {
    }

    class Stream {
    }

    IZipArchiveService <|.. DefaultZipArchiveService
    DefaultZipArchiveService --> ArchiveEntry : uses
    DefaultZipArchiveService --> ArchiveOptions : uses
    DefaultZipArchiveService --> ZipArchive : uses
    ArchiveEntry --> CompressionLevel : optional
    ArchiveOptions --> CompressionLevel : uses
    ArchiveOptions --> ZipArchiveMode : uses
    ArchiveOptions --> Encoding : uses
    ArchiveOptions --> Stream : uses
    IZipArchiveService --> ArchiveEntry : API parameter

File-Level Changes

Change Details Files
Switch archive APIs from raw file paths to ArchiveEntry metadata objects and update archiving implementation accordingly.
  • Change IZipArchiveService.ArchiveAsync overloads to take IEnumerable<ArchiveEntry> instead of IEnumerable<string>.
  • Update DefaultZipArchiveService.ArchiveAsync methods to pass ArchiveEntry collections into a revised ArchiveFilesAsync helper.
  • Rewrite ArchiveFilesAsync to create entries using ArchiveEntry.EntryName, use per-entry CompressionLevel when present, and distinguish between files and directories via SourceFileName.
src/BootstrapBlazor/Services/IZipArchiveService.cs
src/BootstrapBlazor/Services/DefaultZipArchiveService.cs
Simplify and modernize directory archiving and extraction APIs to be async-focused and remove redundant overloads.
  • Rename ArchiveDirectory to ArchiveDirectoryAsync in IZipArchiveService and DefaultZipArchiveService, updating call sites accordingly.
  • Remove the ArchiveDirectory overload that accepted arbitrary string entries and its AddFolderToZip helper, centralizing folder zipping logic in ArchiveDirectoryAsync.
  • Remove the synchronous ExtractToDirectory API from IZipArchiveService and DefaultZipArchiveService, leaving only ExtractToDirectoryAsync for directory extraction.
src/BootstrapBlazor/Services/IZipArchiveService.cs
src/BootstrapBlazor/Services/DefaultZipArchiveService.cs
test/UnitTest/Services/ZipArchiveServiceTest.cs
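With the synchronous `ExtractToDirectory` removed, callers go through the async variant only. A sketch of the resulting call pattern, using the method name from the class diagram above; the `zipService` variable and paths are illustrative, and the sketch assumes the trailing `Encoding`/`CancellationToken` parameters have defaults:

```csharp
// Hypothetical call sketch; "zipService" is assumed to be an IZipArchiveService.
var extracted = await zipService.ExtractToDirectoryAsync(
    "archive.zip",      // archiveFile
    "/tmp/output",      // destinationDirectoryName
    overwriteFiles: true);

if (!extracted)
{
    Console.WriteLine("extraction failed");
}
```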
Introduce ArchiveEntry value type for describing archive items with custom names and compression settings.
  • Add a readonly record struct ArchiveEntry with SourceFileName, EntryName, and nullable CompressionLevel properties.
  • Use ArchiveEntry.EntryName instead of Path.GetFileName where callers want custom entry paths inside the archive.
  • Allow per-entry CompressionLevel overrides falling back to ArchiveOptions.CompressionLevel when null.
src/BootstrapBlazor/Services/ArchiveEntry.cs
src/BootstrapBlazor/Services/DefaultZipArchiveService.cs
test/UnitTest/Services/ZipArchiveServiceTest.cs
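The fallback described above is a plain null-coalescing over a nullable `CompressionLevel`. A minimal, self-contained illustration of the resolution rule (the helper name `Effective` is invented for this sketch):

```csharp
using System;
using System.IO.Compression;

// Mirrors the described fallback: a per-entry level wins when set,
// otherwise the options-level default applies.
static CompressionLevel Effective(CompressionLevel? perEntry, CompressionLevel fallback)
    => perEntry ?? fallback;

Console.WriteLine(Effective(null, CompressionLevel.Optimal));                      // Optimal
Console.WriteLine(Effective(CompressionLevel.Fastest, CompressionLevel.Optimal));  // Fastest
```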
Extend and adjust unit tests to exercise the new ArchiveEntry-based API and async-only extract/directory operations.
  • Update existing Archive_Ok test to construct ArchiveEntry items from temporary files and to use the new ArchiveAsync signatures.
  • Change tests to use ExtractToDirectoryAsync instead of the removed synchronous ExtractToDirectory method.
  • Add ZipArchive_Ok to validate creating a directory entry plus file entry via System.IO.Compression.ZipArchive.
  • Add ArchiveAsync_Ok to verify per-entry custom paths, per-entry compression levels, and directory-as-entry behaviors when archiving to a file.
test/UnitTest/Services/ZipArchiveServiceTest.cs

Assessment against linked issues

#7371 — Objective: Expose ArchiveAsync methods on IZipArchiveService and implement them in DefaultZipArchiveService to provide asynchronous archive operations.


sourcery-ai Bot previously approved these changes Dec 19, 2025

sourcery-ai Bot (Contributor) left a comment

Hey there - I've reviewed your changes - here's some feedback:

  • In ArchiveFilesAsync, the ReadStreamAsync branch ignores ArchiveEntry.CompressionLevel and always uses options.CompressionLevel; consider honoring the per-entry compression level here for consistency with the non-async file/directory branches.
  • When creating directory entries you use Path.DirectorySeparatorChar in the zip entry name; for portability and interoperability of zip archives it would be safer to normalize to '/' regardless of OS path separator.
  • In the ReadStreamAsync branch you both await using the entryStream and explicitly call entryStream.Close(); the explicit Close() is redundant and can be removed to simplify the resource management.
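The separator concern in the second point can be addressed with a small helper along these lines; this is a sketch suggested by the review, not code from the PR (the helper name `NormalizeEntryName` is invented here):

```csharp
using System;
using System.IO;

// Zip entry names conventionally use '/' regardless of the OS path
// separator; normalize both the primary and alternate separators.
static string NormalizeEntryName(string entryName)
    => entryName
        .Replace(Path.DirectorySeparatorChar, '/')
        .Replace(Path.AltDirectorySeparatorChar, '/');

// OS-independent: whichever separator the platform uses, '/' comes out.
Console.WriteLine(NormalizeEntryName("docs" + Path.DirectorySeparatorChar + "readme.txt")); // docs/readme.txt
```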
## Individual Comments

### Comment 1
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:41-48` </location>
<code_context>
-                var entry = archive.CreateEntry(Path.GetFileName(f), options.CompressionLevel);
-                using var entryStream = entry.Open();
-                await using var content = await options.ReadStreamAsync(f);
+                var entry = archive.CreateEntry(f.EntryName, options.CompressionLevel);
+                await using var content = await options.ReadStreamAsync(f.SourceFileName);
+                await using var entryStream = entry.Open();
                 await content.CopyToAsync(entryStream);
+                entryStream.Close();
</code_context>

<issue_to_address>
**suggestion (bug_risk):** Per-entry CompressionLevel is not honored when using ReadStreamAsync, unlike the other branches.

Here you always pass `options.CompressionLevel` to `CreateEntry`, whereas the other branches use `f.CompressionLevel ?? options.CompressionLevel`. With a custom stream factory, callers currently can’t override compression per entry on this path. Please align this branch with the others so per-entry compression works consistently (e.g., use `f.CompressionLevel ?? options.CompressionLevel`).

```suggestion
            if (options.ReadStreamAsync != null)
            {
                var entry = archive.CreateEntry(f.EntryName, f.CompressionLevel ?? options.CompressionLevel);
                await using var content = await options.ReadStreamAsync(f.SourceFileName);
                await using var entryStream = entry.Open();
                await content.CopyToAsync(entryStream);
                entryStream.Close();
            }
```
</issue_to_address>

### Comment 2
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:87-96` </location>
<code_context>

-        // 测试压缩多个文件夹
-        var entries = new List<string>()
+    [Fact]
+    public async Task ZipArchive_Ok()
+    {
+        var fileName = Path.Combine(AppContext.BaseDirectory, "test", "3.zip");
+        if (File.Exists(fileName))
         {
-            destFolder,
-            tempFolder,
-        };
-        entries.AddRange(files);
+            File.Delete(fileName);
+        }

-        destFile = Path.Combine(root, "folder.zip");
-        if (File.Exists(destFile))
+        using var fs = File.OpenWrite(fileName);
+        using var zip = new ZipArchive(fs, ZipArchiveMode.Create);
+
+        var item = Path.Combine(AppContext.BaseDirectory, "test", "1.txt");
+        zip.CreateEntry("text/");
+        await zip.CreateEntryFromFileAsync(item, "text/1.txt");
+    }
+
</code_context>

<issue_to_address>
**issue (testing):** ZipArchive_Ok test currently has no assertions and does not exercise the service API

Right now this only exercises `ZipArchive`/`CreateEntryFromFileAsync` and never uses `IZipArchiveService`, so it won’t detect regressions in the service and could be misleading. Please either turn it into a test of `DefaultZipArchiveService` (e.g., open the produced zip and assert the expected directory and file entries) or move/remove it so the unit test suite stays focused on the service behavior rather than framework APIs.
</issue_to_address>

### Comment 3
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:35` </location>
<code_context>
         Assert.NotNull(stream);

-        stream = await archService.ArchiveAsync(files, new ArchiveOptions()
+        stream = await archService.ArchiveAsync(items, new ArchiveOptions()
         {
             CompressionLevel = System.IO.Compression.CompressionLevel.Optimal,
</code_context>

<issue_to_address>
**suggestion (testing):** No test exercises ArchiveAsync with a non-null ReadStreamAsync to cover the async streaming branch

`DefaultZipArchiveService.ArchiveFilesAsync` has a separate path when `options.ReadStreamAsync` is non-null, using `CreateEntry` and `CopyToAsync` from the provided stream instead of reading files directly. Since all current tests set `ReadStreamAsync = null`, that path is untested. Please add or update a test to set `ReadStreamAsync` to a delegate returning a known `Stream`, call `ArchiveAsync` with `ArchiveEntry.SourceFileName` values that the delegate can map, then open the archive and assert the entry contents match the delegated stream output.

Suggested implementation:

```csharp
        // archive using file-based reading
        stream = await archService.ArchiveAsync(items, new ArchiveOptions()
        {
            CompressionLevel = System.IO.Compression.CompressionLevel.Optimal,
            Encoding = System.Text.Encoding.UTF8,
        });
        Assert.NotNull(stream);

        // archive using async streaming via ReadStreamAsync to cover streaming branch
        var entryContentMap = files.ToDictionary(
            file => Path.GetFileName(file),
            file => $"streamed-content-{Path.GetFileName(file)}");

        var streamingItems = files.Select(i => new ArchiveEntry()
        {
            SourceFileName = i,
            EntryName = Path.GetFileName(i)
        });

        stream = await archService.ArchiveAsync(streamingItems, new ArchiveOptions()
        {
            CompressionLevel = System.IO.Compression.CompressionLevel.Optimal,
            Encoding = System.Text.Encoding.UTF8,
            ReadStreamAsync = async (entry, cancellationToken) =>
            {
                // Map entry name to known content and return a stream over that content
                var key = entry.EntryName ?? Path.GetFileName(entry.SourceFileName);
                if (key == null || !entryContentMap.TryGetValue(key, out var content))
                {
                    throw new InvalidOperationException($"No test content registered for archive entry '{key}'.");
                }

                return new MemoryStream(System.Text.Encoding.UTF8.GetBytes(content));
            }
        });

        Assert.NotNull(stream);

        // verify that the archive actually contains the data produced by ReadStreamAsync
        stream.Position = 0;
        using (var archive = new System.IO.Compression.ZipArchive(stream, System.IO.Compression.ZipArchiveMode.Read, leaveOpen: true, entryNameEncoding: System.Text.Encoding.UTF8))
        {
            foreach (var kvp in entryContentMap)
            {
                var entry = archive.GetEntry(kvp.Key);
                Assert.NotNull(entry);

                using (var entryStream = entry.Open())
                using (var reader = new StreamReader(entryStream, System.Text.Encoding.UTF8))
                {
                    var actual = await reader.ReadToEndAsync();
                    Assert.Equal(kvp.Value, actual);
                }
            }
        }

        var archiveFile = Path.Combine(root, "test.zip");

```

This change assumes:
1. `ArchiveEntry` has `EntryName` and `SourceFileName` properties and that `ArchiveOptions.ReadStreamAsync` has a signature compatible with `(ArchiveEntry entry, CancellationToken cancellationToken) => Task<Stream>`. If the actual signature differs (for example, it only passes `SourceFileName` or includes additional parameters), update the lambda accordingly.
2. `using System.IO;` and `using System.IO.Compression;` directives are already present at the top of the file. If not, add them so that `MemoryStream`, `StreamReader`, and `ZipArchive` resolve correctly.
3. If your implementation of `DefaultZipArchiveService` uses a different encoding or `ZipArchive` overload, adjust the `ZipArchive` constructor call to match your conventions (e.g., omit `entryNameEncoding` if not supported in your target framework).
</issue_to_address>

### Comment 4
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:35` </location>
<code_context>
     }

-    private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<string> files, ArchiveOptions? options = null)
+    private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<ArchiveEntry> entries, ArchiveOptions? options = null)
     {
         options ??= new ArchiveOptions();
</code_context>

<issue_to_address>
**issue (complexity):** Consider refactoring `ArchiveFilesAsync` into smaller helper methods and cleaner branching so the different archive behaviors are separated and easier to understand and maintain.

You can keep the new functionality while reducing complexity in `ArchiveFilesAsync` by extracting the mixed concerns and removing redundant calls.

### 1. Split responsibilities in `ArchiveFilesAsync`

Currently the loop mixes three behaviors and inline logic. Extracting them into helpers keeps behavior identical and makes the flow easier to read and test:

```csharp
private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<ArchiveEntry> entries, ArchiveOptions? options = null)
{
    options ??= new ArchiveOptions();
    using var archive = new ZipArchive(stream, options.Mode, options.LeaveOpen, options.Encoding);

    foreach (var f in entries)
    {
        if (options.ReadStreamAsync != null)
        {
            await AddStreamEntryAsync(archive, f, options);
        }
        else if (Directory.Exists(f.SourceFileName))
        {
            AddDirectoryEntry(archive, f, options);
        }
        else
        {
            AddFileEntry(archive, f, options);
        }
    }
}
```

### 2. Extract small helpers (name + compression + IO)

These helpers centralize name normalization and compression-level resolution, and make each branch do “one thing”:

```csharp
private static CompressionLevel GetEffectiveCompressionLevel(ArchiveEntry entry, ArchiveOptions options)
    => entry.CompressionLevel ?? options.CompressionLevel;

private static string NormalizeDirectoryEntryName(string? entryName)
{
    if (string.IsNullOrEmpty(entryName))
    {
        return string.Empty;
    }

    return entryName.EndsWith(Path.DirectorySeparatorChar)
        ? entryName
        : $"{entryName}{Path.DirectorySeparatorChar}";
}

private static async Task AddStreamEntryAsync(ZipArchive archive, ArchiveEntry f, ArchiveOptions options)
{
    var entry = archive.CreateEntry(f.EntryName, GetEffectiveCompressionLevel(f, options));

    await using var content = await options.ReadStreamAsync!(f.SourceFileName);
    await using var entryStream = entry.Open();
    await content.CopyToAsync(entryStream);
}

private static void AddDirectoryEntry(ZipArchive archive, ArchiveEntry f, ArchiveOptions options)
{
    var normalizedName = NormalizeDirectoryEntryName(f.EntryName);
    if (!string.IsNullOrEmpty(normalizedName))
    {
        archive.CreateEntry(normalizedName, GetEffectiveCompressionLevel(f, options));
    }
}

private static void AddFileEntry(ZipArchive archive, ArchiveEntry f, ArchiveOptions options)
{
    archive.CreateEntryFromFile(
        f.SourceFileName,
        f.EntryName,
        GetEffectiveCompressionLevel(f, options));
}
```

### 3. Remove redundant `Close()` with `await using`

`await using` already disposes the stream; the explicit `Close()` is unnecessary and slightly misleading:

```csharp
// Before
await using var content = await options.ReadStreamAsync(f.SourceFileName);
await using var entryStream = entry.Open();
await content.CopyToAsync(entryStream);
entryStream.Close();

// After
await using var content = await options.ReadStreamAsync(f.SourceFileName);
await using var entryStream = entry.Open();
await content.CopyToAsync(entryStream);
```

### 4. Optional: push directory knowledge into `ArchiveEntry`

If you control the `ArchiveEntry` type, you could avoid repeated filesystem checks and make the intent clearer:

```csharp
public sealed class ArchiveEntry
{
    public string SourceFileName { get; set; } = default!;
    public string EntryName { get; set; } = default!;
    public CompressionLevel? CompressionLevel { get; set; }
    public bool IsDirectory { get; set; }  // new
}
```

Then the loop becomes purely declarative:

```csharp
foreach (var f in entries)
{
    if (options.ReadStreamAsync != null)
    {
        await AddStreamEntryAsync(archive, f, options);
    }
    else if (f.IsDirectory)
    {
        AddDirectoryEntry(archive, f, options);
    }
    else
    {
        AddFileEntry(archive, f, options);
    }
}
```

All of these changes preserve the new behaviors (custom entry names, per-entry compression, directory entries, custom stream provider) while reducing nesting, centralizing repeated logic, and simplifying the main method.
</issue_to_address>


codecov Bot commented Dec 19, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 100.00%. Comparing base (c045592) to head (f8ff481).
⚠️ Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff            @@
##              main     #7373   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          747       748    +1     
  Lines        32776     32757   -19     
  Branches      4543      4543           
=========================================
- Hits         32776     32757   -19     
Flag Coverage Δ
BB 100.00% <100.00%> (?)


Copilot AI (Contributor) left a comment
Pull request overview

This PR refactors the IZipArchiveService interface to use a new ArchiveEntry record struct instead of simple string paths, enabling more flexible archive creation with per-entry compression settings and custom entry names. The changes also remove synchronous methods in favor of async-only alternatives.

Key Changes:

  • Introduced ArchiveEntry record struct with SourceFileName, EntryName, and optional CompressionLevel properties
  • Modified ArchiveAsync methods to accept IEnumerable<ArchiveEntry> instead of IEnumerable<string>
  • Removed synchronous methods ExtractToDirectory and ArchiveDirectory, keeping only async versions
  • Enhanced implementation to support archiving directories as entries alongside files
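The directory-as-entry behavior builds on a System.IO.Compression convention: an entry whose name ends in `/` and carries no content is treated as a folder. A self-contained sketch of that convention, independent of the service code in this PR:

```csharp
using System;
using System.IO;
using System.IO.Compression;

using var ms = new MemoryStream();
using (var zip = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
{
    // A trailing slash and no body marks a directory entry.
    zip.CreateEntry("text/");

    var file = zip.CreateEntry("text/1.txt");
    using var writer = new StreamWriter(file.Open());
    writer.Write("hello");
}

ms.Position = 0;
using var read = new ZipArchive(ms, ZipArchiveMode.Read);
foreach (var entry in read.Entries)
{
    Console.WriteLine($"{entry.FullName} ({entry.Length} bytes)");
}
// prints: text/ (0 bytes), then text/1.txt (5 bytes)
```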

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 9 comments.

File Description
src/BootstrapBlazor/Services/IZipArchiveService.cs Updated interface signatures to use ArchiveEntry instead of string paths; added new ArchiveEntry record struct definition; removed obsolete XML doc tags
src/BootstrapBlazor/Services/DefaultZipArchiveService.cs Implemented new ArchiveFilesAsync logic to handle both files and directories via ArchiveEntry; removed synchronous methods; added directory entry support with proper path handling
test/UnitTest/Services/ZipArchiveServiceTest.cs Updated existing tests to use ArchiveEntry objects; added new test ZipArchive_Ok for directory entries; added comprehensive test ArchiveAsync_Ok demonstrating mixed file/directory archiving with per-entry compression levels


Comment threads:
  • test/UnitTest/Services/ZipArchiveServiceTest.cs
  • src/BootstrapBlazor/Services/DefaultZipArchiveService.cs
  • src/BootstrapBlazor/Services/DefaultZipArchiveService.cs
  • test/UnitTest/Services/ZipArchiveServiceTest.cs
  • src/BootstrapBlazor/Services/DefaultZipArchiveService.cs (outdated)
  • test/UnitTest/Services/ZipArchiveServiceTest.cs
  • src/BootstrapBlazor/Services/DefaultZipArchiveService.cs
  • src/BootstrapBlazor/Services/IZipArchiveService.cs (outdated)
  • test/UnitTest/Services/ZipArchiveServiceTest.cs
ArgoZhang (Member, Author) commented:
@sourcery-ai review

sourcery-ai Bot (Contributor) left a comment
Hey there - I've reviewed your changes - here's some feedback:

  • In ArchiveFilesAsync, the ReadStreamAsync branch always uses options.CompressionLevel when creating entries and ignores ArchiveEntry.CompressionLevel, which means per-entry compression settings are only honored for the file-system branch; consider using f.CompressionLevel ?? options.CompressionLevel consistently in both paths.
  • The new ArchiveEntry record struct is declared in the global namespace; if this type is intended to be part of the BootstrapBlazor.Services API surface, it would be clearer and more consistent to place it in the same namespace as IZipArchiveService.
  • When creating entries from ArchiveEntry (both directory and file cases), there is no validation for empty or null EntryName; adding a guard or a sensible default (e.g., Path.GetFileName(SourceFileName)) would avoid hard-to-diagnose runtime errors from ZipArchive.
## Individual Comments

### Comment 1
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:35-43` </location>
<code_context>
+    private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<ArchiveEntry> entries, ArchiveOptions? options = null)
</code_context>

<issue_to_address>
**suggestion (bug_risk):** Per-entry CompressionLevel is not honored when using ReadStreamAsync.

In the `ReadStreamAsync != null` branch you call:
```csharp
var entry = archive.CreateEntry(f.EntryName, options.CompressionLevel);
```
so `ArchiveEntry.CompressionLevel` is ignored. In contrast, the `File.Exists` branch uses:
```csharp
archive.CreateEntryFromFile(f.SourceFileName, f.EntryName, f.CompressionLevel ?? options.CompressionLevel);
```
For consistent per-entry overrides, this branch should also use `f.CompressionLevel ?? options.CompressionLevel`.

Suggested implementation:

```csharp
            if (options.ReadStreamAsync != null)
            {

```

```csharp
                var entry = archive.CreateEntry(f.EntryName, f.CompressionLevel ?? options.CompressionLevel);

```

I only see the beginning of the `ReadStreamAsync` branch. Ensure that:
1. This `CreateEntry` call actually exists in that branch and is updated as shown.
2. `ArchiveEntry` has a `CompressionLevel` property compatible with the `ZipArchive.CreateEntry` overload taking a `CompressionLevel`.
If there are multiple `CreateEntry` calls under `ReadStreamAsync`, apply the same `f.CompressionLevel ?? options.CompressionLevel` logic to each one.
</issue_to_address>

### Comment 2
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:60-62` </location>
<code_context>
+                    archive.CreateEntry(entryName, f.CompressionLevel ?? options.CompressionLevel);
+                }
+            }
+            else if (File.Exists(f.SourceFileName))
+            {
+                archive.CreateEntryFromFile(f.SourceFileName, f.EntryName, f.CompressionLevel ?? options.CompressionLevel);
             }
         }
</code_context>

<issue_to_address>
**issue:** Guard against null or empty EntryName for file entries to avoid runtime exceptions.

In the directory branch you guard `EntryName` before calling `CreateEntry`, but in the file branch you pass `f.EntryName` directly to `CreateEntryFromFile`. If `ArchiveEntry` is created with a null/empty `EntryName`, this will throw an `ArgumentException`. Please either validate and skip/throw when `EntryName` is null or empty, or derive a default (e.g. `Path.GetFileName(f.SourceFileName)`).
</issue_to_address>

### Comment 3
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:33` </location>
<code_context>
+        var stream = await archService.ArchiveAsync(items, new ArchiveOptions()
</code_context>

<issue_to_address>
**suggestion (testing):** Add coverage for ArchiveOptions.ReadStreamAsync and per-entry compression when archiving to a stream.

`Archive_Ok` currently only covers the default `ArchiveOptions` and a non-null options instance, but not the `ReadStreamAsync` path or per-entry `CompressionLevel`. Since the code branches when `options.ReadStreamAsync != null` and uses `options.CompressionLevel` instead of `ArchiveEntry.CompressionLevel` there, please add a test that:

- Sets `ArchiveOptions.ReadStreamAsync` and includes at least one `ArchiveEntry` with a custom `CompressionLevel`.
- Asserts that the archive entries are correct and that the `ReadStreamAsync` path is exercised (e.g., by reading back known contents after extraction).

This will validate the async streaming path and the intended compression-level behavior with the new `ArchiveEntry` API.
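For the round-trip assertion, a self-contained sketch using only `System.IO.Compression` shows the shape of the check (writing an entry from an in-memory stream, as a `ReadStreamAsync` callback would supply, at an explicit per-entry compression level, then reading it back); wiring this through `ArchiveOptions.ReadStreamAsync` and `ArchiveEntry.CompressionLevel` is left to the repo's test fixture:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

// Write one entry from an in-memory payload at an explicit compression level.
var payload = Encoding.UTF8.GetBytes("hello");
using var ms = new MemoryStream();
using (var zip = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
{
    var entry = zip.CreateEntry("test1/test.log", CompressionLevel.NoCompression);
    using var es = entry.Open();
    es.Write(payload, 0, payload.Length);
}

// Read it back and verify the contents survived the round trip.
ms.Position = 0;
using var readZip = new ZipArchive(ms, ZipArchiveMode.Read);
using var reader = new StreamReader(readZip.GetEntry("test1/test.log")!.Open());
Console.WriteLine(reader.ReadToEnd()); // prints "hello"
```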
</issue_to_address>

### Comment 4
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:157` </location>
<code_context>
+        await archService.ArchiveAsync(fileName, new List<ArchiveEntry>()
</code_context>

<issue_to_address>
**suggestion (testing):** Strengthen ArchiveAsync_Ok to assert archive contents, not just file existence.

Right now the test only asserts `File.Exists(fileName)`, which doesn't validate that the archive's contents match what you configured.

Please extend it to actually inspect the zip and verify that:
- `test1/test.log` and `test2/test.log` entries exist and are files (optionally including content checks).
- Directory structure behaves as expected (e.g., directory entries `test1/` and `test2/` exist, or that file paths under those directories are correct).
- The total number of entries is as expected.

This will make the test closely validate the new `ArchiveEntry`-based behavior instead of just confirming that some file was written.

```suggestion
        Assert.True(File.Exists(fileName));

        using (var archive = System.IO.Compression.ZipFile.OpenRead(fileName))
        {
            // Ensure we have exactly the expected number of entries
            Assert.Equal(2, archive.Entries.Count);

            System.IO.Compression.ZipArchiveEntry? entry1 = null;
            System.IO.Compression.ZipArchiveEntry? entry2 = null;

            foreach (var entry in archive.Entries)
            {
                if (string.Equals(entry.FullName, "test1/test.log", StringComparison.Ordinal))
                {
                    entry1 = entry;
                }
                else if (string.Equals(entry.FullName, "test2/test.log", StringComparison.Ordinal))
                {
                    entry2 = entry;
                }
            }

            // Entries for test1/test.log and test2/test.log should exist
            Assert.NotNull(entry1);
            Assert.NotNull(entry2);

            // Sanity check that both entries are non-empty files
            Assert.NotEqual(0, entry1!.Length);
            Assert.NotEqual(0, entry2!.Length);

            // Optional: verify contents (each source file wrote a single byte 65)
            using (var stream1 = entry1.Open())
            using (var stream2 = entry2.Open())
            {
                var b1 = stream1.ReadByte();
                var b2 = stream2.ReadByte();

                Assert.Equal(65, b1);
                Assert.Equal(65, b2);

                // No extra bytes expected
                Assert.Equal(-1, stream1.ReadByte());
                Assert.Equal(-1, stream2.ReadByte());
            }
        }
```
</issue_to_address>

### Comment 5
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:88-97` </location>
<code_context>
-        // 测试压缩多个文件夹
-        var entries = new List<string>()
+    [Fact]
+    public async Task ZipArchive_Ok()
+    {
+        var fileName = Path.Combine(AppContext.BaseDirectory, "test", "3.zip");
+        if (File.Exists(fileName))
         {
-            destFolder,
-            tempFolder,
-        };
-        entries.AddRange(files);
+            File.Delete(fileName);
+        }

-        destFile = Path.Combine(root, "folder.zip");
-        if (File.Exists(destFile))
+        using var fs = File.OpenWrite(fileName);
+        using var zip = new ZipArchive(fs, ZipArchiveMode.Create);
+
+        var item = Path.Combine(AppContext.BaseDirectory, "test", "1.txt");
+        zip.CreateEntry("text/");
+        await zip.CreateEntryFromFileAsync(item, "text/1.txt");
+    }
+
</code_context>

<issue_to_address>
**question (testing):** Clarify the purpose of ZipArchive_Ok or adapt it to validate DefaultZipArchiveService behavior.

This test exercises `ZipArchive` directly rather than `IZipArchiveService`/`DefaultZipArchiveService`, and doesn't assert on directory entries, custom names, or the zip contents. Either turn it into a focused low-level `ZipArchive` test (in an appropriate fixture, with assertions on the archive), or change it to use the service API to create the zip and then verify the resulting entries/async behavior. In its current form, it doesn't validate the new service behavior and is likely redundant.
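If you keep it as a low-level test, a self-contained sketch of the assertions it could make (temp-file paths here are illustrative; the sync `CreateEntryFromFile` stands in for the async variant):

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Arrange: a throwaway source file and target zip under the temp directory.
var src = Path.Combine(Path.GetTempPath(), "1.txt");
File.WriteAllText(src, "A");
var zipPath = Path.Combine(Path.GetTempPath(), "3.zip");
File.Delete(zipPath);

using (var fs = File.OpenWrite(zipPath))
using (var zip = new ZipArchive(fs, ZipArchiveMode.Create))
{
    zip.CreateEntry("text/");                    // explicit directory entry
    zip.CreateEntryFromFile(src, "text/1.txt");  // file placed under it
}

// Assert: the directory entry and the file entry are both present.
using var archive = ZipFile.OpenRead(zipPath);
Console.WriteLine(archive.Entries.Count);                  // 2
Console.WriteLine(archive.GetEntry("text/1.txt") != null); // True
```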
</issue_to_address>


