
FakeCLR: Exploring Contrastive Learning for Solving Latent Discontinuity in Data-Efficient GANs (ECCV 2022)


FakeCLR: Exploring Contrastive Learning for Solving Latent Discontinuity in Data-Efficient GANs
Ziqiang Li, Chaoyue Wang, Heliang Zheng, Jing Zhang, and Bin Li

[Paper]

In this paper, we revisit and compare different contrastive learning strategies in data-efficient GANs (DE-GANs), and identify that (i) the current bottleneck of generative performance is the discontinuity of the latent space, and (ii) compared to other contrastive learning strategies, instance perturbation works toward latent-space continuity and brings the major improvement to DE-GANs. Based on these observations, we propose FakeCLR, which applies contrastive learning only to perturbed fake samples and devises three related training techniques: Noise-related Latent Augmentation, Diversity-aware Queue, and Forgetting Factor of Queue. Our experiments establish a new state of the art on both few-shot generation and limited-data generation: on multiple datasets, FakeCLR achieves more than a 15% FID improvement over existing DE-GANs.
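
To make the instance-perturbation idea concrete, here is a minimal sketch of noise-related latent augmentation. It is illustrative only, not the repository's code; sigma and n_views are assumed hyperparameters, not values from the paper:

import torch

def latent_perturbations(z, sigma=0.1, n_views=2):
    # Perturb each latent code with small Gaussian noise. Generator
    # outputs for perturbed copies of the same z serve as positive
    # pairs in the contrastive loss, encouraging latent continuity.
    # sigma and n_views are illustrative defaults, not the paper's.
    return [z + sigma * torch.randn_like(z) for _ in range(n_views)]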

Qualitative results

(Figure: qualitative results.)

Models

The following pretrained models are available for download:

Model                FID     Link
Obama-100            26.95   link
Grumpy-Cat-100       19.56   link
Panda-100            8.42    link
AnimalFace Cat-160   26.34   link
AnimalFace Dog-389   42.02   link

Training

This repository is built on styleGAN2-ada-pytorch and InsGen, so dataset preparation follows the styleGAN2-ada-pytorch pipeline.
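
For example, raw images can be packaged into the expected dataset format with the dataset_tool.py script from the styleGAN2-ada-pytorch codebase (assuming it is inherited unchanged; both paths are placeholders):

python dataset_tool.py --source=${RAW_IMAGE_DIR} --dest=${DATA_PATH}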

Compared to InsGen, we add only one additional head (a fake head) on top of the original discriminator. Furthermore, we add an iteration-based weight for the negative samples in the negative queue; a sketch of this weighting is given below.
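
As a rough illustration of the iteration-based weighting (the Forgetting Factor of Queue), here is a minimal sketch of a weighted InfoNCE loss. It is not the repository's implementation; the function name and all defaults (temperature, forgetting) are illustrative:

import torch

def weighted_info_nce(query, key, queue, queue_ages,
                      temperature=0.07, forgetting=0.999):
    # query, key: (N, D) L2-normalized embeddings of two perturbed views
    # of the same fake sample. queue: (K, D) negative embeddings.
    # queue_ages: (K,) iterations since each negative was enqueued.
    l_pos = (query * key).sum(dim=1, keepdim=True)   # (N, 1) positive logits
    l_neg = query @ queue.t()                        # (N, K) negative logits
    logits = torch.cat([l_pos, l_neg], dim=1) / temperature
    # Forgetting factor: older negatives are down-weighted exponentially
    # inside the log-sum-exp denominator.
    weights = forgetting ** queue_ages               # (K,)
    exp_logits = logits.exp()
    denom = exp_logits[:, :1] + (exp_logits[:, 1:] * weights).sum(
        dim=1, keepdim=True)
    return -(logits[:, :1] - denom.log()).mean()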

Please use the following command to start your own training:

python train.py --gpus=2 \
                --data=${DATA_PATH} \
                --cfg=paper256 \
                --batch=64 \
                --mirror=True \
                --kimg=10000 \
                --ada_linear=True \
                --outdir=training_example
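
After training, FID can be computed with the calc_metrics.py script from styleGAN2-ada-pytorch (assuming it is inherited unchanged from the base codebase; ${NETWORK_PKL} is a placeholder for a saved snapshot):

python calc_metrics.py --metrics=fid50k_full \
                       --data=${DATA_PATH} \
                       --network=${NETWORK_PKL}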

Acknowledgements

We thank Janne Hellsten and Tero Karras for their PyTorch codebase styleGAN2-ada-pytorch. We also thank Ceyuan Yang, Yujun Shen, and Yinghao Xu for their PyTorch codebase InsGen.

BibTeX

@misc{li2022fakeclr,
    title={FakeCLR: Exploring Contrastive Learning for Solving Latent Discontinuity in Data-Efficient GANs},
    author={Ziqiang Li and Chaoyue Wang and Heliang Zheng and Jing Zhang and Bin Li},
    year={2022},
    eprint={2207.08630},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
