GANimator

Authors:

Peizhuo Li¹, Kfir Aberman², Zihan Zhang³, Rana Hanocka³, Olga Sorkine-Hornung¹

Affiliations:

1. ETH Zurich, Switzerland

2. Google Research

3. The University of Chicago

Abstract

We present GANimator, a generative model that learns to synthesize novel motions from a single, short motion sequence. GANimator generates motions that resemble the core elements of the original motion, while simultaneously synthesizing novel and diverse movements. Existing data-driven techniques for motion synthesis require a large motion dataset that contains the desired and specific skeletal structure. By contrast, GANimator only requires training on a single motion sequence, enabling novel motion synthesis for a variety of skeletal structures, e.g., bipeds, quadrupeds, hexapeds, and more. Our framework contains a series of generative and adversarial neural networks, each responsible for generating motions at a specific frame rate. The framework progressively learns to synthesize motion from random noise, enabling hierarchical control over the generated motion content across varying levels of detail. We show a number of applications, including crowd simulation, key-frame editing, style transfer, and interactive control, which all learn from a single input sequence. Code and data for this paper are available at https://peizhuoli.github.io/ganimator.
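As a rough illustration of the coarse-to-fine idea described in the abstract, the following PyTorch sketch stacks a few 1D-convolutional generator stages, each refining a temporally upsampled motion produced by the previous stage, starting from pure noise at the coarsest level. This is a hedged sketch under stated assumptions, not the authors' implementation: the stage count, channel width, residual refinement, and linear upsampling are illustrative choices, and the per-stage discriminators and skeleton-aware layers of the actual method are omitted.

```python
# Illustrative sketch only: a progressive, coarse-to-fine generator stack.
# Sizes, layers, and upsampling scheme are assumptions, not GANimator's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StageGenerator(nn.Module):
    """One generator stage: adds detail to a (noisy) coarse motion signal."""

    def __init__(self, channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size=5, padding=2),
            nn.LeakyReLU(0.2),
            nn.Conv1d(channels, channels, kernel_size=5, padding=2),
        )

    def forward(self, coarse: torch.Tensor, noise: torch.Tensor) -> torch.Tensor:
        # Residual refinement: predict detail on top of the coarse motion.
        return coarse + self.net(coarse + noise)


class ProgressiveMotionGenerator(nn.Module):
    """Stack of stages, each at double the temporal resolution of the previous one."""

    def __init__(self, channels: int = 64, num_stages: int = 4, base_frames: int = 16):
        super().__init__()
        self.stages = nn.ModuleList([StageGenerator(channels) for _ in range(num_stages)])
        self.channels = channels
        self.base_frames = base_frames

    def forward(self, batch: int = 1) -> torch.Tensor:
        # The coarsest stage is driven purely by random noise.
        motion = torch.zeros(batch, self.channels, self.base_frames)
        for i, stage in enumerate(self.stages):
            if i > 0:
                # Double the frame rate before refining at the next level of detail.
                motion = F.interpolate(motion, scale_factor=2, mode="linear", align_corners=False)
            motion = stage(motion, noise=torch.randn_like(motion))
        return motion


if __name__ == "__main__":
    # Sample two motions represented as (batch, pose features, frames).
    generator = ProgressiveMotionGenerator()
    print(generator(batch=2).shape)  # torch.Size([2, 64, 128])
```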

Funder

European Research Council

Publisher

Association for Computing Machinery (ACM)

Subject

Computer Graphics and Computer-Aided Design

Cited by 34 articles.

1. Interactive Character Control with Auto-Regressive Motion Diffusion Models. ACM Transactions on Graphics, 2024-07-19.

2. MoConVQ: Unified Physics-Based Motion Control via Scalable Discrete Representations. ACM Transactions on Graphics, 2024-07-19.

3. Matting by Generation. SIGGRAPH Conference Papers '24, 2024-07-13.

4. TEDi: Temporally-Entangled Diffusion for Long-Term Motion Synthesis. SIGGRAPH Conference Papers '24, 2024-07-13.

5. Making motion matching stable and fast with Lipschitz-continuous neural networks and Sparse Mixture of Experts. Computers & Graphics, 2024-05.
