Conversation
Reviewer's Guide

Refactors the zip archive service API to work with a new ArchiveEntry value type, consolidates directory/archive methods around async operations, and updates tests to cover entry-based archiving, directory handling, and async extraction.

Sequence diagram for ArchiveAsync with ArchiveEntry handling

sequenceDiagram
participant Caller
participant IZipArchiveService
participant DefaultZipArchiveService
participant ArchiveOptions
participant ZipArchive
participant FileSystem
Caller->>IZipArchiveService: ArchiveAsync(entries, options)
IZipArchiveService->>DefaultZipArchiveService: ArchiveAsync(entries, options)
DefaultZipArchiveService->>DefaultZipArchiveService: create MemoryStream
alt options is null
DefaultZipArchiveService->>ArchiveOptions: new ArchiveOptions()
DefaultZipArchiveService->>ArchiveOptions: set LeaveOpen = true
else options provided
DefaultZipArchiveService->>ArchiveOptions: set LeaveOpen = true
end
DefaultZipArchiveService->>ZipArchive: new ZipArchive(stream, options.Mode, options.LeaveOpen, options.Encoding)
loop for each entry in entries
alt options.ReadStreamAsync not null
DefaultZipArchiveService->>ZipArchive: CreateEntry(entry.EntryName, options.CompressionLevel)
DefaultZipArchiveService->>ArchiveOptions: ReadStreamAsync(entry.SourceFileName)
ArchiveOptions-->>DefaultZipArchiveService: content Stream
DefaultZipArchiveService->>ZipArchive: entry.Open()
DefaultZipArchiveService->>ZipArchive: CopyToAsync(content, entryStream)
else Directory.Exists(entry.SourceFileName)
DefaultZipArchiveService->>FileSystem: Directory.Exists(entry.SourceFileName)
alt EntryName not empty
DefaultZipArchiveService->>ZipArchive: CreateEntry(entry.EntryName as folder, entry.CompressionLevel or options.CompressionLevel)
end
else File.Exists(entry.SourceFileName)
DefaultZipArchiveService->>FileSystem: File.Exists(entry.SourceFileName)
DefaultZipArchiveService->>ZipArchive: CreateEntryFromFile(entry.SourceFileName, entry.EntryName, entry.CompressionLevel or options.CompressionLevel)
end
end
DefaultZipArchiveService-->>Caller: Stream (position = 0)
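The flow above can be sketched from the caller's side as follows. This is a minimal illustration, not code from the PR: it assumes `zipService` is a resolved `IZipArchiveService` and that `ArchiveEntry`/`ArchiveOptions` have the shapes shown in the class diagram below; the file paths are made up.

```csharp
using System.IO.Compression;
using System.Text;

// Hypothetical call site; paths are illustrative.
var entries = new List<ArchiveEntry>
{
    new() { SourceFileName = "logs/app.log", EntryName = "logs/app.log" },
    // A directory source: per the diagram, only a folder entry is created for it.
    new() { SourceFileName = "data", EntryName = "data" },
};

await using var stream = await zipService.ArchiveAsync(entries, new ArchiveOptions
{
    CompressionLevel = CompressionLevel.Optimal,
    Encoding = Encoding.UTF8,
});
// Per the sequence diagram, the service resets the returned stream's Position to 0.
```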
Class diagram for updated zip archive service API

classDiagram
class IZipArchiveService {
<<interface>>
+Task~Stream~ ArchiveAsync(IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
+Task ArchiveAsync(string archiveFile, IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
+Task ArchiveDirectoryAsync(string archiveFile, string directoryName, CompressionLevel compressionLevel, bool includeBaseDirectory, Encoding encoding, CancellationToken token)
+Task~bool~ ExtractToDirectoryAsync(string archiveFile, string destinationDirectoryName, bool overwriteFiles, Encoding encoding, CancellationToken token)
+ZipArchiveEntry GetEntry(string archiveFile, string entryFile, bool overwriteFiles, Encoding encoding)
}
class DefaultZipArchiveService {
+Task~Stream~ ArchiveAsync(IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
+Task ArchiveAsync(string archiveFile, IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
+Task ArchiveDirectoryAsync(string archiveFile, string directoryName, CompressionLevel compressionLevel, bool includeBaseDirectory, Encoding encoding, CancellationToken token)
+Task~bool~ ExtractToDirectoryAsync(string archiveFile, string destinationDirectoryName, bool overwriteFiles, Encoding encoding, CancellationToken token)
+ZipArchiveEntry GetEntry(string archiveFile, string entryFile, bool overwriteFiles, Encoding encoding)
-static Task ArchiveFilesAsync(Stream stream, IEnumerable~ArchiveEntry~ entries, ArchiveOptions options)
}
class ArchiveEntry {
<<record struct>>
+string SourceFileName
+string EntryName
+CompressionLevel CompressionLevel
}
class ArchiveOptions {
+ZipArchiveMode Mode
+bool LeaveOpen
+Encoding Encoding
+CompressionLevel CompressionLevel
+Func~string, Task~Stream~~ ReadStreamAsync
}
class ZipArchive {
}
class ZipArchiveEntry {
}
class CompressionLevel {
}
class ZipArchiveMode {
}
class Encoding {
}
class CancellationToken {
}
class Stream {
}
IZipArchiveService <|.. DefaultZipArchiveService
DefaultZipArchiveService --> ArchiveEntry : uses
DefaultZipArchiveService --> ArchiveOptions : uses
DefaultZipArchiveService --> ZipArchive : uses
ArchiveEntry --> CompressionLevel : optional
ArchiveOptions --> CompressionLevel : uses
ArchiveOptions --> ZipArchiveMode : uses
ArchiveOptions --> Encoding : uses
ArchiveOptions --> Stream : uses
IZipArchiveService --> ArchiveEntry : API parameter
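In C#, the two data types in the diagram might look roughly like this. It is a sketch only: modifiers, defaults, and nullability are assumptions, though the review comments below imply `ArchiveEntry.CompressionLevel` is nullable, since the implementation falls back with `f.CompressionLevel ?? options.CompressionLevel`.

```csharp
using System.IO.Compression;
using System.Text;

// Sketch only; the actual declarations live in src/BootstrapBlazor/Services.
public record struct ArchiveEntry
{
    public string SourceFileName { get; init; }
    public string EntryName { get; init; }
    public CompressionLevel? CompressionLevel { get; init; } // per-entry override
}

public class ArchiveOptions
{
    public ZipArchiveMode Mode { get; set; } = ZipArchiveMode.Create;
    public bool LeaveOpen { get; set; }
    public Encoding? Encoding { get; set; }
    public CompressionLevel CompressionLevel { get; set; } = CompressionLevel.Optimal;
    public Func<string, Task<Stream>>? ReadStreamAsync { get; set; }
}
```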
Hey there - I've reviewed your changes - here's some feedback:
- In `ArchiveFilesAsync`, the `ReadStreamAsync` branch ignores `ArchiveEntry.CompressionLevel` and always uses `options.CompressionLevel`; consider honoring the per-entry compression level here for consistency with the non-async file/directory branches.
- When creating directory entries you use `Path.DirectorySeparatorChar` in the zip entry name; for portability and interoperability of zip archives it would be safer to normalize to `'/'` regardless of OS path separator.
- In the `ReadStreamAsync` branch you both `await using` the `entryStream` and explicitly call `entryStream.Close()`; the explicit `Close()` is redundant and can be removed to simplify the resource management.
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- In `ArchiveFilesAsync`, the `ReadStreamAsync` branch ignores `ArchiveEntry.CompressionLevel` and always uses `options.CompressionLevel`; consider honoring the per-entry compression level here for consistency with the non-async file/directory branches.
- When creating directory entries you use `Path.DirectorySeparatorChar` in the zip entry name; for portability and interoperability of zip archives it would be safer to normalize to `'/'` regardless of OS path separator.
- In the `ReadStreamAsync` branch you both `await using` the `entryStream` and explicitly call `entryStream.Close()`; the explicit `Close()` is redundant and can be removed to simplify the resource management.
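The separator concern in the second point could be addressed with a small normalization helper; the name and placement here are hypothetical, not from the PR.

```csharp
using System.IO;

// Zip entry names should use '/' on every OS; a hypothetical helper
// (a no-op on Unix, where DirectorySeparatorChar is already '/'):
static string NormalizeEntryName(string entryName)
    => entryName.Replace(Path.DirectorySeparatorChar, '/');
```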
## Individual Comments
### Comment 1
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:41-48` </location>
<code_context>
- var entry = archive.CreateEntry(Path.GetFileName(f), options.CompressionLevel);
- using var entryStream = entry.Open();
- await using var content = await options.ReadStreamAsync(f);
+ var entry = archive.CreateEntry(f.EntryName, options.CompressionLevel);
+ await using var content = await options.ReadStreamAsync(f.SourceFileName);
+ await using var entryStream = entry.Open();
await content.CopyToAsync(entryStream);
+ entryStream.Close();
</code_context>
<issue_to_address>
**suggestion (bug_risk):** Per-entry CompressionLevel is not honored when using ReadStreamAsync, unlike the other branches.
Here you always pass `options.CompressionLevel` to `CreateEntry`, whereas the other branches use `f.CompressionLevel ?? options.CompressionLevel`. With a custom stream factory, callers currently can’t override compression per entry on this path. Please align this branch with the others so per-entry compression works consistently (e.g., use `f.CompressionLevel ?? options.CompressionLevel`).
```suggestion
if (options.ReadStreamAsync != null)
{
var entry = archive.CreateEntry(f.EntryName, f.CompressionLevel ?? options.CompressionLevel);
await using var content = await options.ReadStreamAsync(f.SourceFileName);
await using var entryStream = entry.Open();
await content.CopyToAsync(entryStream);
entryStream.Close();
}
```
</issue_to_address>
### Comment 2
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:87-96` </location>
<code_context>
- // 测试压缩多个文件夹
- var entries = new List<string>()
+ [Fact]
+ public async Task ZipArchive_Ok()
+ {
+ var fileName = Path.Combine(AppContext.BaseDirectory, "test", "3.zip");
+ if (File.Exists(fileName))
{
- destFolder,
- tempFolder,
- };
- entries.AddRange(files);
+ File.Delete(fileName);
+ }
- destFile = Path.Combine(root, "folder.zip");
- if (File.Exists(destFile))
+ using var fs = File.OpenWrite(fileName);
+ using var zip = new ZipArchive(fs, ZipArchiveMode.Create);
+
+ var item = Path.Combine(AppContext.BaseDirectory, "test", "1.txt");
+ zip.CreateEntry("text/");
+ await zip.CreateEntryFromFileAsync(item, "text/1.txt");
+ }
+
</code_context>
<issue_to_address>
**issue (testing):** ZipArchive_Ok test currently has no assertions and does not exercise the service API
Right now this only exercises `ZipArchive`/`CreateEntryFromFileAsync` and never uses `IZipArchiveService`, so it won’t detect regressions in the service and could be misleading. Please either turn it into a test of `DefaultZipArchiveService` (e.g., open the produced zip and assert the expected directory and file entries) or move/remove it so the unit test suite stays focused on the service behavior rather than framework APIs.
</issue_to_address>
### Comment 3
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:35` </location>
<code_context>
Assert.NotNull(stream);
- stream = await archService.ArchiveAsync(files, new ArchiveOptions()
+ stream = await archService.ArchiveAsync(items, new ArchiveOptions()
{
CompressionLevel = System.IO.Compression.CompressionLevel.Optimal,
</code_context>
<issue_to_address>
**suggestion (testing):** No test exercises ArchiveAsync with a non-null ReadStreamAsync to cover the async streaming branch
`DefaultZipArchiveService.ArchiveFilesAsync` has a separate path when `options.ReadStreamAsync` is non-null, using `CreateEntry` and `CopyToAsync` from the provided stream instead of reading files directly. Since all current tests set `ReadStreamAsync = null`, that path is untested. Please add or update a test to set `ReadStreamAsync` to a delegate returning a known `Stream`, call `ArchiveAsync` with `ArchiveEntry.SourceFileName` values that the delegate can map, then open the archive and assert the entry contents match the delegated stream output.
Suggested implementation:
```csharp
// archive using file-based reading
stream = await archService.ArchiveAsync(items, new ArchiveOptions()
{
CompressionLevel = System.IO.Compression.CompressionLevel.Optimal,
Encoding = System.Text.Encoding.UTF8,
});
Assert.NotNull(stream);
// archive using async streaming via ReadStreamAsync to cover streaming branch
var entryContentMap = files.ToDictionary(
file => Path.GetFileName(file),
file => $"streamed-content-{Path.GetFileName(file)}");
var streamingItems = files.Select(i => new ArchiveEntry()
{
SourceFileName = i,
EntryName = Path.GetFileName(i)
});
stream = await archService.ArchiveAsync(streamingItems, new ArchiveOptions()
{
CompressionLevel = System.IO.Compression.CompressionLevel.Optimal,
Encoding = System.Text.Encoding.UTF8,
ReadStreamAsync = async (entry, cancellationToken) =>
{
// Map entry name to known content and return a stream over that content
var key = entry.EntryName ?? Path.GetFileName(entry.SourceFileName);
if (key == null || !entryContentMap.TryGetValue(key, out var content))
{
throw new InvalidOperationException($"No test content registered for archive entry '{key}'.");
}
return new MemoryStream(System.Text.Encoding.UTF8.GetBytes(content));
}
});
Assert.NotNull(stream);
// verify that the archive actually contains the data produced by ReadStreamAsync
stream.Position = 0;
using (var archive = new System.IO.Compression.ZipArchive(stream, System.IO.Compression.ZipArchiveMode.Read, leaveOpen: true, entryNameEncoding: System.Text.Encoding.UTF8))
{
foreach (var kvp in entryContentMap)
{
var entry = archive.GetEntry(kvp.Key);
Assert.NotNull(entry);
using (var entryStream = entry.Open())
using (var reader = new StreamReader(entryStream, System.Text.Encoding.UTF8))
{
var actual = await reader.ReadToEndAsync();
Assert.Equal(kvp.Value, actual);
}
}
}
var archiveFile = Path.Combine(root, "test.zip");
```
This change assumes:
1. `ArchiveEntry` has `EntryName` and `SourceFileName` properties and that `ArchiveOptions.ReadStreamAsync` has a signature compatible with `(ArchiveEntry entry, CancellationToken cancellationToken) => Task<Stream>`. If the actual signature differs (for example, it only passes `SourceFileName` or includes additional parameters), update the lambda accordingly.
2. `using System.IO;` and `using System.IO.Compression;` directives are already present at the top of the file. If not, add them so that `MemoryStream`, `StreamReader`, and `ZipArchive` resolve correctly.
3. If your implementation of `DefaultZipArchiveService` uses a different encoding or `ZipArchive` overload, adjust the `ZipArchive` constructor call to match your conventions (e.g., omit `entryNameEncoding` if not supported in your target framework).
</issue_to_address>
### Comment 4
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:35` </location>
<code_context>
}
- private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<string> files, ArchiveOptions? options = null)
+ private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<ArchiveEntry> entries, ArchiveOptions? options = null)
{
options ??= new ArchiveOptions();
</code_context>
<issue_to_address>
**issue (complexity):** Consider refactoring `ArchiveFilesAsync` into smaller helper methods and cleaner branching so the different archive behaviors are separated and easier to understand and maintain.
You can keep the new functionality while reducing complexity in `ArchiveFilesAsync` by extracting the mixed concerns and removing redundant calls.
### 1. Split responsibilities in `ArchiveFilesAsync`
Currently the loop mixes three behaviors and inline logic. Extracting them into helpers keeps behavior identical and makes the flow easier to read and test:
```csharp
private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<ArchiveEntry> entries, ArchiveOptions? options = null)
{
options ??= new ArchiveOptions();
using var archive = new ZipArchive(stream, options.Mode, options.LeaveOpen, options.Encoding);
foreach (var f in entries)
{
if (options.ReadStreamAsync != null)
{
await AddStreamEntryAsync(archive, f, options);
}
else if (Directory.Exists(f.SourceFileName))
{
AddDirectoryEntry(archive, f, options);
}
else
{
AddFileEntry(archive, f, options);
}
}
}
```
### 2. Extract small helpers (name + compression + IO)
These helpers centralize name normalization and compression-level resolution, and make each branch do “one thing”:
```csharp
private static CompressionLevel GetEffectiveCompressionLevel(ArchiveEntry entry, ArchiveOptions options)
=> entry.CompressionLevel ?? options.CompressionLevel;
private static string NormalizeDirectoryEntryName(string? entryName)
{
if (string.IsNullOrEmpty(entryName))
{
return string.Empty;
}
return entryName.EndsWith(Path.DirectorySeparatorChar)
? entryName
: $"{entryName}{Path.DirectorySeparatorChar}";
}
private static async Task AddStreamEntryAsync(ZipArchive archive, ArchiveEntry f, ArchiveOptions options)
{
var entry = archive.CreateEntry(f.EntryName, GetEffectiveCompressionLevel(f, options));
await using var content = await options.ReadStreamAsync!(f.SourceFileName);
await using var entryStream = entry.Open();
await content.CopyToAsync(entryStream);
}
private static void AddDirectoryEntry(ZipArchive archive, ArchiveEntry f, ArchiveOptions options)
{
var normalizedName = NormalizeDirectoryEntryName(f.EntryName);
if (!string.IsNullOrEmpty(normalizedName))
{
archive.CreateEntry(normalizedName, GetEffectiveCompressionLevel(f, options));
}
}
private static void AddFileEntry(ZipArchive archive, ArchiveEntry f, ArchiveOptions options)
{
archive.CreateEntryFromFile(
f.SourceFileName,
f.EntryName,
GetEffectiveCompressionLevel(f, options));
}
```
### 3. Remove redundant `Close()` with `await using`
`await using` already disposes the stream; the explicit `Close()` is unnecessary and slightly misleading:
```csharp
// Before
await using var content = await options.ReadStreamAsync(f.SourceFileName);
await using var entryStream = entry.Open();
await content.CopyToAsync(entryStream);
entryStream.Close();
// After
await using var content = await options.ReadStreamAsync(f.SourceFileName);
await using var entryStream = entry.Open();
await content.CopyToAsync(entryStream);
```
### 4. Optional: push directory knowledge into `ArchiveEntry`
If you control the `ArchiveEntry` type, you could avoid repeated filesystem checks and make the intent clearer:
```csharp
public sealed class ArchiveEntry
{
public string SourceFileName { get; set; } = default!;
public string EntryName { get; set; } = default!;
public CompressionLevel? CompressionLevel { get; set; }
public bool IsDirectory { get; set; } // new
}
```
Then the loop becomes purely declarative:
```csharp
foreach (var f in entries)
{
if (options.ReadStreamAsync != null)
{
await AddStreamEntryAsync(archive, f, options);
}
else if (f.IsDirectory)
{
AddDirectoryEntry(archive, f, options);
}
else
{
AddFileEntry(archive, f, options);
}
}
```
All of these changes preserve the new behaviors (custom entry names, per-entry compression, directory entries, custom stream provider) while reducing nesting, centralizing repeated logic, and simplifying the main method.
</issue_to_address>Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
Codecov Report

✅ All modified and coverable lines are covered by tests.

Additional details and impacted files

@@ Coverage Diff @@
## main #7373 +/- ##
=========================================
Coverage 100.00% 100.00%
=========================================
Files 747 748 +1
Lines 32776 32757 -19
Branches 4543 4543
=========================================
- Hits 32776 32757 -19
Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Pull request overview
This PR refactors the IZipArchiveService interface to use a new ArchiveEntry record struct instead of simple string paths, enabling more flexible archive creation with per-entry compression settings and custom entry names. The changes also remove synchronous methods in favor of async-only alternatives.
Key Changes:
- Introduced `ArchiveEntry` record struct with `SourceFileName`, `EntryName`, and optional `CompressionLevel` properties
- Modified `ArchiveAsync` methods to accept `IEnumerable<ArchiveEntry>` instead of `IEnumerable<string>`
- Removed synchronous methods `ExtractToDirectory` and `ArchiveDirectory`, keeping only async versions
- Enhanced implementation to support archiving directories as entries alongside files
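A before/after sketch of the call-site migration these changes imply; the variable names and file paths are illustrative, not taken from the PR.

```csharp
// Before: plain file paths
// Stream s = await zipService.ArchiveAsync(new[] { "a.txt", "b.txt" });

// After: entry-based, with an optional per-entry compression override
var stream = await zipService.ArchiveAsync(new[]
{
    new ArchiveEntry { SourceFileName = "a.txt", EntryName = "a.txt" },
    new ArchiveEntry
    {
        SourceFileName = "b.txt",
        EntryName = "docs/b.txt",
        CompressionLevel = CompressionLevel.NoCompression,
    },
});
```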
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 9 comments.
| File | Description |
|---|---|
| `src/BootstrapBlazor/Services/IZipArchiveService.cs` | Updated interface signatures to use `ArchiveEntry` instead of string paths; added new `ArchiveEntry` record struct definition; removed obsolete XML doc tags |
| `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs` | Implemented new `ArchiveFilesAsync` logic to handle both files and directories via `ArchiveEntry`; removed synchronous methods; added directory entry support with proper path handling |
| `test/UnitTest/Services/ZipArchiveServiceTest.cs` | Updated existing tests to use `ArchiveEntry` objects; added new test `ZipArchive_Ok` for directory entries; added comprehensive test `ArchiveAsync_Ok` demonstrating mixed file/directory archiving with per-entry compression levels |
@sourcery-ai review
Hey there - I've reviewed your changes - here's some feedback:
- In `ArchiveFilesAsync`, the `ReadStreamAsync` branch always uses `options.CompressionLevel` when creating entries and ignores `ArchiveEntry.CompressionLevel`, which means per-entry compression settings are only honored for the file-system branch; consider using `f.CompressionLevel ?? options.CompressionLevel` consistently in both paths.
- The new `ArchiveEntry` record struct is declared in the global namespace; if this type is intended to be part of the `BootstrapBlazor.Services` API surface, it would be clearer and more consistent to place it in the same namespace as `IZipArchiveService`.
- When creating entries from `ArchiveEntry` (both directory and file cases), there is no validation for empty or null `EntryName`; adding a guard or a sensible default (e.g., `Path.GetFileName(SourceFileName)`) would avoid hard-to-diagnose runtime errors from `ZipArchive`.
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- In `ArchiveFilesAsync`, the `ReadStreamAsync` branch always uses `options.CompressionLevel` when creating entries and ignores `ArchiveEntry.CompressionLevel`, which means per-entry compression settings are only honored for the file-system branch; consider using `f.CompressionLevel ?? options.CompressionLevel` consistently in both paths.
- The new `ArchiveEntry` record struct is declared in the global namespace; if this type is intended to be part of the `BootstrapBlazor.Services` API surface, it would be clearer and more consistent to place it in the same namespace as `IZipArchiveService`.
- When creating entries from `ArchiveEntry` (both directory and file cases), there is no validation for empty or null `EntryName`; adding a guard or a sensible default (e.g., `Path.GetFileName(SourceFileName)`) would avoid hard-to-diagnose runtime errors from `ZipArchive`.
## Individual Comments
### Comment 1
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:35-43` </location>
<code_context>
+ private static async Task ArchiveFilesAsync(Stream stream, IEnumerable<ArchiveEntry> entries, ArchiveOptions? options = null)
</code_context>
<issue_to_address>
**suggestion (bug_risk):** Per-entry CompressionLevel is not honored when using ReadStreamAsync.
In the `ReadStreamAsync != null` branch you call:
```csharp
var entry = archive.CreateEntry(f.EntryName, options.CompressionLevel);
```
so `ArchiveEntry.CompressionLevel` is ignored. In contrast, the `File.Exists` branch uses:
```csharp
archive.CreateEntryFromFile(f.SourceFileName, f.EntryName, f.CompressionLevel ?? options.CompressionLevel);
```
For consistent per-entry overrides, this branch should also use `f.CompressionLevel ?? options.CompressionLevel`.
Suggested implementation:
```csharp
if (options.ReadStreamAsync != null)
{
```
```csharp
var entry = archive.CreateEntry(f.EntryName, f.CompressionLevel ?? options.CompressionLevel);
```
I only see the beginning of the `ReadStreamAsync` branch. Ensure that:
1. This `CreateEntry` call actually exists in that branch and is updated as shown.
2. `ArchiveEntry` has a `CompressionLevel` property compatible with the `ZipArchive.CreateEntry` overload taking a `CompressionLevel`.
If there are multiple `CreateEntry` calls under `ReadStreamAsync`, apply the same `f.CompressionLevel ?? options.CompressionLevel` logic to each one.
</issue_to_address>
### Comment 2
<location> `src/BootstrapBlazor/Services/DefaultZipArchiveService.cs:60-62` </location>
<code_context>
+ archive.CreateEntry(entryName, f.CompressionLevel ?? options.CompressionLevel);
+ }
+ }
+ else if (File.Exists(f.SourceFileName))
+ {
+ archive.CreateEntryFromFile(f.SourceFileName, f.EntryName, f.CompressionLevel ?? options.CompressionLevel);
}
}
</code_context>
<issue_to_address>
**issue:** Guard against null or empty EntryName for file entries to avoid runtime exceptions.
In the directory branch you guard `EntryName` before calling `CreateEntry`, but in the file branch you pass `f.EntryName` directly to `CreateEntryFromFile`. If `ArchiveEntry` is created with a null/empty `EntryName`, this will throw an `ArgumentException`. Please either validate and skip/throw when `EntryName` is null or empty, or derive a default (e.g. `Path.GetFileName(f.SourceFileName)`).
</issue_to_address>
### Comment 3
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:33` </location>
<code_context>
+ var stream = await archService.ArchiveAsync(items, new ArchiveOptions()
</code_context>
<issue_to_address>
**suggestion (testing):** Add coverage for ArchiveOptions.ReadStreamAsync and per-entry compression when archiving to a stream.
`Archive_Ok` currently only covers the default `ArchiveOptions` and a non-null options instance, but not the `ReadStreamAsync` path or per-entry `CompressionLevel`. Since the code branches when `options.ReadStreamAsync != null` and uses `options.CompressionLevel` instead of `ArchiveEntry.CompressionLevel` there, please add a test that:
- Sets `ArchiveOptions.ReadStreamAsync` and includes at least one `ArchiveEntry` with a custom `CompressionLevel`.
- Asserts that the archive entries are correct and that the `ReadStreamAsync` path is exercised (e.g., by reading back known contents after extraction).
This will validate the async streaming path and the intended compression-level behavior with the new `ArchiveEntry` API.
</issue_to_address>
### Comment 4
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:157` </location>
<code_context>
+ await archService.ArchiveAsync(fileName, new List<ArchiveEntry>()
</code_context>
<issue_to_address>
**suggestion (testing):** Strengthen ArchiveAsync_Ok to assert archive contents, not just file existence.
Right now the test only asserts `File.Exists(fileName)`, which doesn’t validate that the archive’s contents match what you configured.
Please extend it to actually inspect the zip and verify that:
- `test1/test.log` and `test2/test.log` entries exist and are files (optionally including content checks).
- Directory structure behaves as expected (e.g., directory entries `test1/` and `test2/` exist, or that file paths under those directories are correct).
- The total number of entries is as expected.
This will make the test closely validate the new `ArchiveEntry`-based behavior instead of just confirming that some file was written.
```suggestion
Assert.True(File.Exists(fileName));
using (var archive = System.IO.Compression.ZipFile.OpenRead(fileName))
{
// Ensure we have exactly the expected number of entries
Assert.Equal(2, archive.Entries.Count);
System.IO.Compression.ZipArchiveEntry? entry1 = null;
System.IO.Compression.ZipArchiveEntry? entry2 = null;
foreach (var entry in archive.Entries)
{
if (string.Equals(entry.FullName, "test1/test.log", StringComparison.Ordinal))
{
entry1 = entry;
}
else if (string.Equals(entry.FullName, "test2/test.log", StringComparison.Ordinal))
{
entry2 = entry;
}
}
// Entries for test1/test.log and test2/test.log should exist
Assert.NotNull(entry1);
Assert.NotNull(entry2);
// Sanity check that both entries are non-empty files
Assert.NotEqual(0, entry1!.Length);
Assert.NotEqual(0, entry2!.Length);
// Optional: verify contents (each source file wrote a single byte 65)
using (var stream1 = entry1.Open())
using (var stream2 = entry2.Open())
{
var b1 = stream1.ReadByte();
var b2 = stream2.ReadByte();
Assert.Equal(65, b1);
Assert.Equal(65, b2);
// No extra bytes expected
Assert.Equal(-1, stream1.ReadByte());
Assert.Equal(-1, stream2.ReadByte());
}
}
```
</issue_to_address>
### Comment 5
<location> `test/UnitTest/Services/ZipArchiveServiceTest.cs:88-97` </location>
<code_context>
- // 测试压缩多个文件夹
- var entries = new List<string>()
+ [Fact]
+ public async Task ZipArchive_Ok()
+ {
+ var fileName = Path.Combine(AppContext.BaseDirectory, "test", "3.zip");
+ if (File.Exists(fileName))
{
- destFolder,
- tempFolder,
- };
- entries.AddRange(files);
+ File.Delete(fileName);
+ }
- destFile = Path.Combine(root, "folder.zip");
- if (File.Exists(destFile))
+ using var fs = File.OpenWrite(fileName);
+ using var zip = new ZipArchive(fs, ZipArchiveMode.Create);
+
+ var item = Path.Combine(AppContext.BaseDirectory, "test", "1.txt");
+ zip.CreateEntry("text/");
+ await zip.CreateEntryFromFileAsync(item, "text/1.txt");
+ }
+
</code_context>
<issue_to_address>
**question (testing):** Clarify the purpose of ZipArchive_Ok or adapt it to validate DefaultZipArchiveService behavior.
This test exercises `ZipArchive` directly rather than `IZipArchiveService`/`DefaultZipArchiveService`, and doesn’t assert on directory entries, custom names, or the zip contents. Either turn it into a focused low-level `ZipArchive` test (in an appropriate fixture, with assertions on the archive), or change it to use the service API to create the zip and then verify the resulting entries/async behavior. In its current form, it doesn’t validate the new service behavior and is likely redundant.
</issue_to_address>
Link issues
fixes #7371
Summary by Sourcery
Introduce a more flexible zip archiving model based on archive entries and align the zip service API around asynchronous operations.