[Feature] Class-balanced sampling at the batch level #1753

Open
wants to merge 3 commits into main

Conversation

@bobo0810 (Contributor) commented Aug 7, 2023

Motivation

Most real-world datasets have a long-tailed distribution or a severe class imbalance. As a result, the model classifies minority classes poorly and its generalization ability suffers.

Modification

Added a new sampling strategy: batch-level class-balanced sampling.

BC-breaking (Optional)

Does the modification introduce changes that break the backward compatibility of the downstream repositories?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.

Use cases (Optional)

train_dataloader = dict(
    xxxx,
    sampler=dict(type="BatchBalanceSampler", num_per_class=4),
)
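
For reference, here is a minimal, illustrative sketch of how a batch-level class-balanced sampler can be built on top of torch.utils.data.Sampler. It is not the code added by this PR: the class name mirrors the BatchBalanceSampler used above, but the gt_labels and num_classes_per_batch arguments are assumptions introduced only for this example.

import random
from collections import defaultdict
from typing import Iterator, List, Sequence

from torch.utils.data import Sampler


class BatchBalanceSamplerSketch(Sampler):
    """Yield index batches containing `num_per_class` samples from each of
    `num_classes_per_batch` randomly chosen classes (sketch, not the PR code)."""

    def __init__(self, gt_labels: Sequence[int], num_per_class: int,
                 num_classes_per_batch: int) -> None:
        self.num_per_class = num_per_class
        self.num_classes_per_batch = num_classes_per_batch
        # Bucket dataset indices by class label.
        self.class_to_indices = defaultdict(list)
        for idx, label in enumerate(gt_labels):
            self.class_to_indices[label].append(idx)
        self.classes = list(self.class_to_indices)
        # One epoch covers roughly as many samples as the dataset holds.
        batch_size = num_per_class * num_classes_per_batch
        self.num_batches = max(1, len(gt_labels) // batch_size)

    def __iter__(self) -> Iterator[List[int]]:
        for _ in range(self.num_batches):
            # Pick which classes appear in this batch.
            chosen = random.sample(self.classes, self.num_classes_per_batch)
            batch: List[int] = []
            for cls in chosen:
                pool = self.class_to_indices[cls]
                # Sample with replacement so rare classes can still fill their quota.
                batch.extend(random.choices(pool, k=self.num_per_class))
            yield batch

    def __len__(self) -> int:
        return self.num_batches

In plain PyTorch such a sampler would be passed as the batch_sampler of a DataLoader, e.g. DataLoader(dataset, batch_sampler=BatchBalanceSamplerSketch(labels, num_per_class=4, num_classes_per_batch=8)); in the config above the sampler is instead referenced by type name and presumably built by the downstream registry.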

Checklist

Before PR:

  • Pre-commit or other linting tools are used to fix the potential lint issues.
  • Bug fixes are fully covered by unit tests, the case that causes the bug should be added in the unit tests.
  • The modification is covered by complete unit tests. If not, please add more unit tests to ensure the correctness.
  • The documentation has been modified accordingly, like docstring or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with those projects, like MMDet or MMSeg.
  • CLA has been signed and all committers have signed the CLA in this PR.

codecov bot commented Aug 7, 2023

Codecov Report

Patch coverage: 96.00% and project coverage change: -19.91% ⚠️

Comparison is base (89eaa92) 85.22% compared to head (fe26f47) 65.32%.
Report is 100 commits behind head on main.

❗ Current head fe26f47 differs from pull request most recent head 63c7d9b. Consider uploading reports for the commit 63c7d9b to get more accurate results

Additional details and impacted files
@@             Coverage Diff             @@
##             main    #1753       +/-   ##
===========================================
- Coverage   85.22%   65.32%   -19.91%     
===========================================
  Files         229      359      +130     
  Lines       17243    26039     +8796     
  Branches     2707     4141     +1434     
===========================================
+ Hits        14696    17010     +2314     
- Misses       2046     8411     +6365     
- Partials      501      618      +117     
Flag Coverage Δ
unittests 65.32% <96.00%> (-19.91%) ⬇️

Flags with carried forward coverage won't be shown.

Files Changed Coverage Δ
configs/_base_/datasets/imagenet_bs128_mbv3.py 100.00% <ø> (ø)
configs/_base_/datasets/imagenet_bs32.py 100.00% <ø> (ø)
...onfigs/_base_/datasets/imagenet_bs32_pil_resize.py 100.00% <ø> (ø)
configs/_base_/datasets/imagenet_bs64_swin_224.py 100.00% <ø> (ø)
configs/_base_/datasets/imagenet_bs64_swin_384.py 100.00% <ø> (ø)
configs/hivit/hivit-tiny-p16_16xb64_in1k.py 50.00% <50.00%> (ø)
configs/_base_/datasets/imagenet_bs64_hivit_224.py 100.00% <100.00%> (ø)
configs/_base_/models/hivit/tiny_224.py 100.00% <100.00%> (ø)
...gs/_base_/schedules/imagenet_bs1024_adamw_hivit.py 100.00% <100.00%> (ø)
configs/dinov2/vit-base-p14_dinov2-pre_headless.py 100.00% <100.00%> (ø)
... and 1 more

... and 174 files with indirect coverage changes

