Download Video (MP4, 1080p, 97.8 MB)


3D hand reconstruction from images is a widely studied problem in computer vision and graphics, and is particularly relevant for virtual and augmented reality. Although several 3D hand reconstruction approaches leverage hand models as a strong prior to resolve ambiguities and achieve more robust results, most existing models account only for hand shape and pose and do not model the texture. To fill this gap, in this work we present HTML, the first parametric texture model of human hands. Our model spans several dimensions of hand appearance variability (e.g., related to gender, ethnicity, or age) and only requires a commodity camera for data acquisition. Experimentally, we demonstrate that our appearance model can be used to tackle a range of challenging problems, such as 3D hand reconstruction from a single monocular image. Furthermore, our appearance model can be used to define a neural rendering layer that enables training with a self-supervised photometric loss. We make our model publicly available.
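The core idea of such a parametric appearance model is a low-dimensional linear (PCA) basis over registered texture maps: a new hand texture is generated from a small parameter vector. The sketch below illustrates this in the spirit of the paper; all function names, shapes, and the PCA-via-SVD construction are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np

def build_texture_model(textures, n_components=10):
    """Fit a linear appearance model.
    textures: (N, D) array of N flattened, registered UV texture maps.
    Returns the mean texture, a (K, D) principal basis, and per-component
    standard deviations."""
    mean = textures.mean(axis=0)
    centered = textures - mean
    # SVD of the centered data yields the principal appearance directions.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    basis = Vt[:n_components]                         # (K, D)
    stddev = S[:n_components] / np.sqrt(len(textures) - 1)
    return mean, basis, stddev

def synthesize(mean, basis, stddev, alpha):
    """Generate a texture from low-dimensional parameters alpha (K,).
    alpha is expressed in units of standard deviations along each axis."""
    return mean + (alpha * stddev) @ basis

# Usage with random stand-in data (a real model would use captured scans):
rng = np.random.default_rng(0)
data = rng.normal(size=(50, 256))      # 50 "textures", 256 texels each
mean, basis, std = build_texture_model(data, n_components=5)
tex = synthesize(mean, basis, std, np.zeros(5))  # alpha = 0 gives the mean
```

Because the model is differentiable in `alpha`, it can be plugged into a rendering layer and optimized with a photometric loss against an input image, which is how such a model supports self-supervised personalization.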



BibTeX, 1 KB

@inproceedings{qian2020html,
  title={{HTML: A Parametric Hand Texture Model for 3D Hand Reconstruction and Personalization}},
  author={Qian, Neng and Wang, Jiayi and Mueller, Franziska and Bernard, Florian and Golyanik, Vladislav and Theobalt, Christian},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  year={2020}
}


The authors would like to thank all participants of the data acquisition recordings.
The work was supported by the ERC Consolidator Grant 4DRepLy (770784).


Contact: Jiayi Wang

This page is Zotero and Mendeley translator friendly.
