Code for zeta-mixup (ζ-mixup), a data augmentation technique that is an N-sample generalization of mixup.

ζ-mixup

This repository hosts the code supporting our two papers on ζ-mixup.

ζ-mixup is a multi-sample mixing-based data augmentation method that generates richer and more realistic outputs. It is a generalization of mixup with provably and demonstrably desirable properties, allowing convex combinations of N ≥ 2 samples weighted using a p-series interpolant. ζ-mixup better preserves the intrinsic dimensionality of the original datasets, is computationally efficient, and outperforms mixup, CutMix, and traditional data augmentation methods. Here are some visualizations comparing ζ-mixup to mixup:
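Concretely, a p-series interpolant assigns rapidly decaying weights to the N samples being mixed. The form below is an illustrative sketch (the normalization and the permutation π are written here for exposition; the papers define the exact scheme and admissible values of the exponent γ):

```latex
% Illustrative p-series weighting over N samples:
% after randomly permuting the samples with \pi, the i-th sample
% receives a normalized weight decaying as i^{-\gamma}.
w_i = \frac{i^{-\gamma}}{\sum_{j=1}^{N} j^{-\gamma}},
\qquad
\hat{x} = \sum_{i=1}^{N} w_i \, x_{\pi(i)}
```

Because the weights decay quickly, each synthetic sample stays close to one dominant original sample, which is what keeps the outputs on (or near) the data manifold.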

Half-moons dataset (N = 512)

512 samples with non-linear class boundaries distributed in the shape of interleaving crescents.

[Figure: ζ-mixup vs. mixup on the half-moons dataset]

1-D helix embedded in $\mathbb{R}^3$ (N = 8192)

8192 samples on a 1D helix as an example of low-D manifolds lying in high-D ambient spaces.

Repository Structure

  • zeta_mixup.py: Code for ζ-mixup data augmentation.
  • utils.py: Utility functions: code for generating the ζ-mixup weights and a cross-entropy loss with "soft" target labels.
  • mixup.py: Original mixup implementation (source). Used only for the visualizations above.
  • demo/:
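For orientation, here is a minimal NumPy sketch of the idea implemented in zeta_mixup.py and utils.py. Function names, the default exponent value, and the per-sample permutation loop are illustrative assumptions; the repository's code is the reference implementation:

```python
import numpy as np

def zeta_mixup_weights(n, gamma=2.8):
    """Normalized p-series weights: w_i proportional to i^(-gamma)."""
    i = np.arange(1, n + 1)
    w = i.astype(float) ** (-gamma)
    return w / w.sum()

def zeta_mixup(x, y_onehot, gamma=2.8, rng=None):
    """Mix a batch of n samples into n synthetic samples.

    x:        array of shape (n, ...) with the input samples
    y_onehot: array of shape (n, num_classes) with one-hot labels
    Returns mixed inputs and "soft" label vectors.
    """
    rng = np.random.default_rng(rng)
    n = x.shape[0]
    w = zeta_mixup_weights(n, gamma)
    x_out = np.empty_like(x, dtype=float)
    y_out = np.empty_like(y_onehot, dtype=float)
    for i in range(n):
        # Each synthetic sample uses a fresh random ordering, so the
        # dominant weight falls on a different original sample each time.
        perm = rng.permutation(n)
        x_out[i] = np.tensordot(w, x[perm], axes=1)  # weighted sum over samples
        y_out[i] = w @ y_onehot[perm]                # matching soft labels
    return x_out, y_out
```

Since the weights sum to 1, the soft label rows also sum to 1 and can be consumed directly by a cross-entropy loss that accepts soft targets.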

Abstract

Modern deep learning training procedures rely on model regularization techniques such as data augmentation methods, which generate training samples that increase the diversity of data and richness of label information. A popular recent method, mixup, uses convex combinations of pairs of original samples to generate new samples. However, as we show in our experiments, mixup can produce undesirable synthetic samples, where the data is sampled off the manifold and can contain incorrect labels. We propose ζ-mixup, a generalization of mixup with provably and demonstrably desirable properties that allows convex combinations of T ≥ 2 samples, leading to more realistic and diverse outputs that incorporate information from T original samples by using a p-series interpolant. We show that, compared to mixup, ζ-mixup better preserves the intrinsic dimensionality of the original datasets, which is a desirable property for training generalizable models. Furthermore, we show that our implementation of ζ-mixup is faster than mixup, and extensive evaluation on controlled synthetic and 26 diverse real-world natural and medical image classification datasets shows that ζ-mixup outperforms mixup, CutMix, and traditional data augmentation techniques.

[Figure: ζ-mixup overview]

Citation

If you use our code, please cite our papers. The corresponding BibTeX entries are:

@article{abhishek2024multi,
  author    = {Abhishek, Kumar and Brown, Colin J. and Hamarneh, Ghassan},
  title     = {Multi-Sample $\zeta$-mixup: Richer, More Realistic Synthetic Samples from a $p$-Series Interpolant},
  journal   = {Journal of Big Data},
  volume    = {11},
  number    = {1},
  pages     = {1--41},
  month     = {Mar},
  year      = {2024},
  issn      = {2196-1115},
  url       = {http://dx.doi.org/10.1186/s40537-024-00898-6},
  doi       = {10.1186/s40537-024-00898-6},
  publisher = {Springer}
}

@inproceedings{abhishek2023zetamixup,
  title     = {$\zeta$-mixup: Richer, More Realistic Mixing of Multiple Images},
  author    = {Kumar Abhishek and Colin Joseph Brown and Ghassan Hamarneh},
  booktitle = {Medical Imaging with Deep Learning, short paper track},
  year      = {2023},
  pages     = {1--5},
  url       = {https://openreview.net/forum?id=iXjsAarmqn}
}
