Abstract
Despite the visual importance of hair and the attention paid to hair modeling in graphics research, modeling realistic hair remains a challenging task that only a few artists can perform well. In this paper we present hair meshes, a new method for modeling hair that aims to bring hair modeling as close as possible to modeling polygonal surfaces. This approach gives artists direct control of the overall shape of the hair, letting them model the exact hair shape they desire. We use the hair mesh structure to model the hair volume with topological constraints that allow us to automatically and uniquely trace the path of individual hair strands through this volume. We also define a set of topological operations for creating hair meshes that maintain these constraints. Furthermore, we provide a method for hiding the volumetric structure of the hair mesh from the end user, allowing artists to concentrate on manipulating the outer surface of the hair as a polygonal surface. We explain and show examples of how hair meshes can be used to generate individual hair strands for a wide variety of realistic hair styles.
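To illustrate the idea of uniquely tracing strands through a layered volume, the following is a minimal sketch, not the paper's implementation: it assumes a hypothetical hair mesh given as a stack of quad layers, where the topological constraints mean each strand keeps the same bilinear (u, v) coordinates in every layer, so its path from root to tip is determined uniquely.

```python
# Illustrative sketch (assumed representation, not the paper's code):
# a hair mesh as a stack of quad "layers"; a strand with fixed (u, v)
# bilinear coordinates is traced uniquely through every layer.

def bilerp(corners, u, v):
    """Bilinearly interpolate inside a quad given corners p00, p10, p01, p11."""
    p00, p10, p01, p11 = corners
    return tuple(
        (1 - u) * (1 - v) * a + u * (1 - v) * b + (1 - u) * v * c + u * v * d
        for a, b, c, d in zip(p00, p10, p01, p11)
    )

def trace_strand(layers, u, v):
    """Return one vertex per layer: the strand's path through the volume."""
    return [bilerp(layer, u, v) for layer in layers]

# Two stacked unit quads; the top layer is lifted by 1 in z.
layers = [
    [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)],
    [(0, 0, 1), (1, 0, 1), (0, 1, 1), (1, 1, 1)],
]
path = trace_strand(layers, 0.5, 0.5)
# path == [(0.5, 0.5, 0.0), (0.5, 0.5, 1.0)]
```

Because the (u, v) coordinates are fixed across layers, every point in the volume lies on exactly one strand, which is the uniqueness property the abstract refers to.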
Funder
Division of Computing and Communication Foundations
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Graphics and Computer-Aided Design
Cited by
49 articles.
1. More Than Killmonger Locs: A Style Guide for Black Hair (in Computer Graphics);ACM SIGGRAPH 2024 Courses;2024-07-27
2. Real-time Physically Guided Hair Interpolation;ACM Transactions on Graphics;2024-07-19
3. Real-Time Hair Rendering with Hair Meshes;Special Interest Group on Computer Graphics and Interactive Techniques Conference Conference Papers '24;2024-07-13
4. Modeling Hair Strands with Roving Capsules;Special Interest Group on Computer Graphics and Interactive Techniques Conference Conference Papers '24;2024-07-13
5. Hair Tubes: Stylized Hair from Polygonal Meshes of Arbitrary Topology;SIGGRAPH Asia 2023 Technical Communications;2023-11-28