Bringing portraits to life

Hadar Averbuch-Elor, Daniel Cohen-Or, Johannes Kopf, Michael F. Cohen

Research output: Contribution to journal › Conference article › peer-review

153 Scopus citations

Abstract

We present a technique to automatically animate a still portrait, making it possible for the subject in the photo to come to life and express various emotions. We use a driving video (of a different subject) and develop means to transfer the expressiveness of the subject in the driving video to the target portrait. In contrast to previous work that requires an input video of the target face to reenact a facial performance, our technique uses only a single target image. We animate the target image through 2D warps that imitate the facial transformations in the driving video. As warps alone do not carry the full expressiveness of the face, we add fine-scale dynamic details which are commonly associated with facial expressions such as creases and wrinkles. Furthermore, we hallucinate regions that are hidden in the input target face, most notably in the inner mouth. Our technique gives rise to reactive profiles, where people in still images can automatically interact with their viewers. We demonstrate our technique operating on numerous still portraits from the internet.
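The abstract describes animating the target image through 2D warps that imitate the facial transformations in the driving video. As a rough illustration (not the paper's actual method, which combines the warp with fine-scale details and hallucinated regions), a landmark-driven backward warp can be sketched by blending per-landmark displacements with Gaussian radial-basis weights; the function name and parameters below are hypothetical:

```python
import numpy as np

def warp_image(image, src_pts, dst_pts, sigma=20.0):
    """Backward-warp `image` so landmarks at src_pts move toward dst_pts.

    A Gaussian radial-basis blend of per-landmark offsets is a crude
    stand-in for the 2D warps described in the abstract; the paper's
    warp is considerably more elaborate.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    grid = np.stack([xs.ravel(), ys.ravel()], axis=1)       # (h*w, 2) pixel coords
    offsets = src_pts - dst_pts                             # backward-mapping offsets
    d2 = ((grid[:, None, :] - dst_pts[None, :, :]) ** 2).sum(-1)
    wgt = np.exp(-d2 / (2.0 * sigma ** 2))                  # (h*w, k) RBF weights
    wgt /= wgt.sum(axis=1, keepdims=True) + 1e-12           # normalize per pixel
    disp = wgt @ offsets                                    # blended displacement
    sample = grid + disp                                    # where to read from
    sx = np.clip(np.round(sample[:, 0]).astype(int), 0, w - 1)
    sy = np.clip(np.round(sample[:, 1]).astype(int), 0, h - 1)
    return image[sy, sx].reshape(image.shape)               # nearest-neighbor resample
```

When the driving landmarks coincide with the source landmarks, the displacement field is zero and the image is returned unchanged, which is a convenient sanity check.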

Original language: English
Article number: a196
Journal: ACM Transactions on Graphics
Volume: 36
Issue number: 6
State: Published - 20 Nov 2017
Event: ACM SIGGRAPH Asia Conference, SA 2017 - Bangkok, Thailand
Duration: 27 Nov 2017 – 30 Nov 2017

Keywords

  • Face animation
  • Facial reenactment
