forked from Harxis/G2Face
G2Face: High-Fidelity Reversible Face Anonymization via Generative and Geometric Priors
Official PyTorch Implementation

This repo contains the training and inference code for our G2Face paper.

G2Face: High-Fidelity Reversible Face Anonymization via Generative and Geometric Priors | arXiv | IEEE Xplore |
Haoxin Yang, Xuemiao Xu, Cheng Xu, Huaidong Zhang, Jing Qin, Yi Wang, Pheng-Ann Heng, Shengfeng He

South China University of Technology, The Hong Kong Polytechnic University, Dongguan University of Technology, The Chinese University of Hong Kong, Singapore Management University.

Abstract


  Reversible face anonymization, unlike traditional face pixelization, seeks to replace sensitive identity information in facial images with synthesized alternatives, preserving privacy without sacrificing image clarity. Traditional methods, such as encoder-decoder networks, often result in significant loss of facial details due to their limited learning capacity. Additionally, relying on latent manipulation in pre-trained GANs can lead to changes in ID-irrelevant attributes, adversely affecting data utility due to GAN inversion inaccuracies. This paper introduces G2Face, which leverages both generative and geometric priors to enhance identity manipulation, achieving high-quality reversible face anonymization without compromising data utility. We utilize a 3D face model to extract geometric information from the input face, integrating it with a pre-trained GAN-based decoder. This synergy of generative and geometric priors allows the decoder to produce realistic anonymized faces with consistent geometry. Moreover, multi-scale facial features are extracted from the original face and combined with the decoder using our novel identity-aware feature fusion blocks (IFF). This integration enables precise blending of the generated facial patterns with the original ID-irrelevant features, resulting in accurate identity manipulation. Extensive experiments demonstrate that our method outperforms existing state-of-the-art techniques in face anonymization and recovery, while preserving high data utility.
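
The identity-aware feature fusion (IFF) described above blends generated facial patterns with the original ID-irrelevant features at each scale. A minimal sketch of such a block is shown below; this is inferred from the paper's description only, and the class name, layer choices, and mask design are assumptions, not the official implementation.

```python
# Hypothetical sketch of an identity-aware feature fusion (IFF) block.
# All names and layer choices are assumptions for illustration.
import torch
import torch.nn as nn

class IFFBlock(nn.Module):
    """Blend decoder (generated) features with multi-scale features
    extracted from the original face, via a learned per-pixel mask."""
    def __init__(self, channels: int):
        super().__init__()
        # Predict a blending mask from the concatenated feature maps.
        self.mask = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(channels, 1, 1),
            nn.Sigmoid(),
        )

    def forward(self, decoder_feat, original_feat):
        m = self.mask(torch.cat([decoder_feat, original_feat], dim=1))
        # m near 1: keep the generated (ID-relevant) content;
        # m near 0: keep the original ID-irrelevant detail.
        return m * decoder_feat + (1 - m) * original_feat

x_gen = torch.randn(1, 64, 32, 32)   # decoder feature map
x_orig = torch.randn(1, 64, 32, 32)  # original-face feature map
fused = IFFBlock(64)(x_gen, x_orig)
print(fused.shape)  # torch.Size([1, 64, 32, 32])
```

The key design point is that the mask is spatial, so the block can keep the generated pattern in identity-bearing regions while passing background and other ID-irrelevant content through unchanged.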

Installation

pip install -r requirements.txt 

Then download the pretrained weights from Google Drive or Baidu Netdisk and move them to weights/ and model/d3dfr/BFM/.
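
The expected layout after downloading can be prepared as follows; the directory names come from the instructions above, while the individual checkpoint file names depend on the download and are not listed here.

```shell
# Create the directories the repo expects for pretrained weights.
mkdir -p weights model/d3dfr/BFM
# After downloading, place the checkpoints under weights/ and the
# BFM (Basel Face Model) files under model/d3dfr/BFM/, then verify:
ls weights model/d3dfr/BFM
```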

Training

sh ./train.sh

Test

python test.py --celebahq_path YOUR_CELEBAHQ_PATH 

Acknowledgement

This repository borrows from stylegan2-pytorch, insightface, and Deep3DFaceRecon_pytorch.

We thank the authors of these projects for their great work and for sharing their code.

License

This project is released under the MIT license. Please see the LICENSE file for more information.

Citation

If you find this repository helpful, please consider citing:

@article{Yang2024G2face,
  author={Yang, Haoxin and Xu, Xuemiao and Xu, Cheng and Zhang, Huaidong and Qin, Jing and Wang, Yi and Heng, Pheng-Ann and He, Shengfeng},
  journal={IEEE Transactions on Information Forensics and Security}, 
  title={G2Face: High-Fidelity Reversible Face Anonymization via Generative and Geometric Priors}, 
  year={2024},
  doi={10.1109/TIFS.2024.3449104}
}
