FaceShifter: Towards High Fidelity And Occlusion Aware Face Swapping

Lingzhi Li1
Jianmin Bao2
Hao Yang2
Dong Chen2
Fang Wen2
1Peking University
2Microsoft Research

Dataset [GitHub]

CVPR 2020 (Oral) [Paper]

Our face swapping results on wild face images under various challenging conditions. All results are generated using a single well-trained two-stage model.


In this work, we propose a novel two-stage face swapping algorithm, called FaceShifter, for high fidelity and occlusion aware face swapping. Unlike many existing face swapping works that leverage only limited information from the target image when synthesizing the swapped face, FaceShifter generates the swapped face with high fidelity by exploiting and integrating the target attributes thoroughly and adaptively. FaceShifter handles facial occlusions with a second synthesis stage consisting of a Heuristic Error Acknowledging Refinement Network (HEAR-Net), which is trained to recover anomalous regions in a self-supervised way, without any manual annotations. Experiments show that existing deepfake detection algorithms perform poorly on images produced by FaceShifter, since it achieves superior quality over all existing benchmarks. However, our newly developed Face X-Ray [Li et al. CVPR 2020] method can reliably detect forged images created by FaceShifter.
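The two-stage pipeline described above can be sketched as follows. This is a minimal illustration of the data flow only, with stub functions standing in for the trained networks; the function names (`aei_net`, `hear_net`, `face_shift`) and the averaging stub are hypothetical and are not the authors' actual implementation. The heuristic error is the gap between the target image and its own-identity reconstruction through the first-stage network, which tends to highlight occluded regions.

```python
import numpy as np

# Hypothetical stand-in for the stage-1 network (AEI-Net): blends the
# source identity into the target attributes. The stub simply averages
# the two inputs to show the data flow, not the real synthesis.
def aei_net(identity_image, target_image):
    return 0.5 * identity_image + 0.5 * target_image

# Hypothetical stand-in for the stage-2 network (HEAR-Net): refines the
# swapped face guided by the heuristic error map.
def hear_net(swapped, heuristic_error):
    return swapped + heuristic_error

def face_shift(source, target):
    # Stage 1: initial high-fidelity swap.
    swapped = aei_net(source, target)
    # Heuristic error: reconstruct the target with its *own* identity;
    # regions the network fails to reproduce (e.g. occlusions) show up
    # as large errors, which HEAR-Net learns to restore.
    reconstructed_target = aei_net(target, target)
    heuristic_error = target - reconstructed_target
    # Stage 2: occlusion-aware refinement.
    return hear_net(swapped, heuristic_error)

source = np.zeros((256, 256, 3), dtype=np.float32)
target = np.ones((256, 256, 3), dtype=np.float32)
result = face_shift(source, target)
print(result.shape)  # (256, 256, 3)
```

Because the error map is derived from the target image alone, the refinement stage needs no occlusion annotations, which is what makes the second stage self-supervised.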


We are excited to announce that we are now collaborating with the FaceForensics++ team to advance face forgery detection for GAN-based face swapping methods. We generated videos between different identities with our proposed FaceShifter, producing a new dataset called FaceForensics-FaceShifter. Many thanks to Andreas Rössler and Matthias Nießner for their kind help; you can download the dataset via the link above. With this dataset you can 1) compare your own face replacement method with FaceShifter qualitatively and quantitatively, and 2) develop and test state-of-the-art face forgery detection algorithms on the generated videos.



Lingzhi Li, Jianmin Bao, Hao Yang, Dong Chen, Fang Wen.
FaceShifter: Towards High Fidelity And Occlusion Aware Face Swapping
In CVPR, 2020 (oral presentation). (Paper)


We'd like to thank Sicheng Xu, Yu Deng, and Jiaolong Yang for helpful advice and discussion. We are grateful to Jinpeng Lin for helping with the user study webpage. The source code of this webpage was borrowed from Peter Wang. The views, opinions and/or findings expressed are those of the authors and should not be interpreted as representing the official views or policies of Peking University or Microsoft Corporation.