fix(CardUpload): add duplicate filter logic#7430
Conversation
Reviewer's Guide

Implements duplicate filtering when merging runtime upload files into the cached upload file list, and makes a minor formatting adjustment in the DropUpload preview list markup.

Sequence diagram for the GetUploadFiles duplicate filtering logic:

```mermaid
sequenceDiagram
    participant Caller
    participant UploadBase
    participant filesCache as _filesCache
    participant DefaultFileList
    participant UploadFiles
    Caller->>UploadBase: GetUploadFiles()
    alt _filesCache is null
        UploadBase->>UploadBase: Initialize _filesCache as new List
        alt DefaultFileList not null
            UploadBase->>filesCache: AddRange(DefaultFileList)
        end
        loop for each file in UploadFiles
            UploadBase->>filesCache: Contains(file)?
            alt file not in _filesCache
                UploadBase->>filesCache: Add(file)
            end
        end
    end
    UploadBase-->>Caller: _filesCache
```
Class diagram for the UploadBase GetUploadFiles changes:

```mermaid
classDiagram
    class UploadBase {
        - List~UploadFile~ _filesCache
        - List~UploadFile~ DefaultFileList
        - List~UploadFile~ UploadFiles
        + List~UploadFile~ GetUploadFiles()
    }
    class UploadFile {
    }
    UploadBase --> UploadFile : uses
```
Hey - I've left some high-level feedback:
- The duplicate check in `GetUploadFiles` relies on reference equality via `List.Contains`, which may not prevent logical duplicates if `UploadFile` instances are reconstructed; consider using a key-based comparison (e.g., by name/ID) or a custom `IEqualityComparer` to ensure true de-duplication.
- Using `UploadFiles.ForEach` with `List.Contains` inside introduces an O(n²) pattern as the number of files grows; consider using a `HashSet` or pre-building a lookup keyed by the relevant file identifier to make the merge linear-time.
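The linear-time, key-based merge suggested above could be sketched roughly as follows. The `UploadFile` record here is a minimal hypothetical stand-in (the real class in BootstrapBlazor has more members), and keying on `FileName` is an assumption about what identifies a logical duplicate:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical minimal stand-in for the component's UploadFile type.
public record UploadFile(string FileName, long Size);

public static class UploadMerge
{
    // Linear-time merge: a HashSet keyed by FileName tracks which files are
    // already cached, so each candidate is checked in O(1) instead of an
    // O(n) List.Contains scan.
    public static List<UploadFile> Merge(List<UploadFile> cache, List<UploadFile> uploads)
    {
        var seen = new HashSet<string>(StringComparer.Ordinal);
        foreach (var f in cache)
        {
            seen.Add(f.FileName);
        }
        foreach (var f in uploads)
        {
            // HashSet<T>.Add returns false when the key is already present,
            // so duplicates are skipped without a second lookup.
            if (seen.Add(f.FileName))
            {
                cache.Add(f);
            }
        }
        return cache;
    }
}

public class Program
{
    public static void Main()
    {
        var cache = new List<UploadFile> { new("a.png", 10) };
        var uploads = new List<UploadFile> { new("a.png", 10), new("b.png", 20) };
        var merged = UploadMerge.Merge(cache, uploads);
        Console.WriteLine(merged.Count); // duplicate "a.png" filtered, so 2
    }
}
```

This avoids both issues at once: the comparison is by file identity rather than object reference, and the merge stays O(n + m).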
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- The duplicate check in `GetUploadFiles` relies on reference equality via `List.Contains`, which may not prevent logical duplicates if `UploadFile` instances are reconstructed; consider using a key-based comparison (e.g., by name/ID) or a custom `IEqualityComparer` to ensure true de-duplication.
- Using `UploadFiles.ForEach` with `List.Contains` inside introduces an O(n²) pattern when the number of files grows; consider using a `HashSet` or pre-building a lookup keyed by the relevant file identifier to make the merge linear-time.
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@ Coverage Diff @@
##              main    #7430   +/- ##
=========================================
  Coverage   100.00%  100.00%
=========================================
  Files          748      748
  Lines        32801    32807       +6
  Branches      4552     4552
=========================================
+ Hits         32801    32807       +6
```

Flags with carried forward coverage won't be shown.
Pull request overview
This PR adds duplicate filter logic to prevent duplicate files from being added to the upload file cache. The implementation changes how files are aggregated in the GetUploadFiles method by checking for duplicates before adding them.
- Replaced `AddRange` with a `ForEach` loop that checks for duplicates using `Contains` before adding files
- Applied code formatting in DropUpload.razor
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| src/BootstrapBlazor/Components/Upload/UploadBase.cs | Modified GetUploadFiles method to add duplicate filtering logic when combining DefaultFileList and UploadFiles |
| src/BootstrapBlazor/Components/Upload/DropUpload.razor | Code formatting change - split long attribute line across multiple lines |
```diff
-        _filesCache.AddRange(UploadFiles);
+        UploadFiles.ForEach(f =>
+        {
+            if (!_filesCache.Contains(f))
```
The Contains method will only perform reference equality checks since UploadFile class does not override Equals and GetHashCode. This means the duplicate filter will only prevent adding the exact same object instance, not duplicate files with the same content or properties. If the intent is to prevent duplicate files based on file properties (e.g., FileName, OriginFileName, or File reference), you should either override Equals/GetHashCode in the UploadFile class or use a different comparison method such as LINQ's Any with a custom predicate.
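One way to act on this comment is a custom `IEqualityComparer<UploadFile>` passed to LINQ's `Contains` overload, so reconstructed instances with the same name compare equal. This is a sketch under assumptions: the `UploadFile` shape below is a hypothetical minimum, and matching on `FileName` alone is one possible choice of key:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical minimal UploadFile; the real class has more members.
public class UploadFile
{
    public string? FileName { get; set; }
    public long Size { get; set; }
}

// Key-based comparer so Contains matches on file identity rather than
// object reference.
public class UploadFileNameComparer : IEqualityComparer<UploadFile>
{
    public bool Equals(UploadFile? x, UploadFile? y) =>
        string.Equals(x?.FileName, y?.FileName, StringComparison.Ordinal);

    public int GetHashCode(UploadFile obj) =>
        obj.FileName?.GetHashCode() ?? 0;
}

public class Program
{
    public static void Main()
    {
        var comparer = new UploadFileNameComparer();
        var cache = new List<UploadFile> { new() { FileName = "a.png" } };

        // A reconstructed instance with the same name is now treated
        // as a duplicate, even though it is a different object.
        var candidate = new UploadFile { FileName = "a.png" };
        Console.WriteLine(cache.Contains(candidate, comparer)); // True
    }
}
```

Note that `List<T>.Contains` itself has no comparer overload; the comparer-aware `Contains` used here is the LINQ extension `Enumerable.Contains`. Alternatively, `Equals`/`GetHashCode` could be overridden on `UploadFile` itself, at the cost of changing its equality semantics everywhere.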
```csharp
UploadFiles.ForEach(f =>
{
    if (!_filesCache.Contains(f))
    {
        _filesCache.Add(f);
    }
});
```
The new duplicate filter logic lacks test coverage. Consider adding a test case that verifies files are properly deduplicated when the same UploadFile instances are added to both DefaultFileList and UploadFiles, or when GetUploadFiles is called multiple times.
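A test along these lines could pin the deduplication contract down. `FakeUpload` below is a hypothetical stand-in that mirrors the merge behaviour described in the reviewer's guide (initialize the cache from `DefaultFileList`, then add only unseen entries from `UploadFiles`); it is not the real component, and the real test would exercise `UploadBase.GetUploadFiles` directly:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in mirroring the cache-merge behaviour of
// UploadBase.GetUploadFiles, so the dedup contract can be checked
// without the component's dependencies.
public class FakeUpload
{
    private List<string>? _filesCache;

    public List<string> DefaultFileList { get; } = new();
    public List<string> UploadFiles { get; } = new();

    public List<string> GetUploadFiles()
    {
        if (_filesCache == null)
        {
            _filesCache = new List<string>(DefaultFileList);
            UploadFiles.ForEach(f =>
            {
                if (!_filesCache.Contains(f))
                {
                    _filesCache.Add(f);
                }
            });
        }
        return _filesCache;
    }
}

public class Program
{
    public static void Main()
    {
        var upload = new FakeUpload();
        upload.DefaultFileList.Add("a.png");
        upload.UploadFiles.Add("a.png"); // duplicate of a default entry
        upload.UploadFiles.Add("b.png");

        var files = upload.GetUploadFiles();
        Console.WriteLine(files.Count); // 2: "a.png" appears only once

        // Repeated calls return the same cached, deduplicated list.
        Console.WriteLine(upload.GetUploadFiles().Count); // still 2
    }
}
```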
Link issues
fixes #7418
Summary By Copilot
Regression?
Risk
Verification
Packaging changes reviewed?
☑️ Self Check before Merge
Summary by Sourcery
Prevent duplicate upload entries when combining cached uploads with the current upload list.
Bug Fixes:
Enhancements: