A Swap Dominated Tensor Re-Generation Strategy for Training Deep Learning Models

Authors:

Wen Lijie (1), Zong Zan (1), Lin Li (1), Lin Leilei (2)

Affiliation:

1. School of Software, Tsinghua University

2. School of Management, Capital Normal University

Funder:

National Key Research and Development Program of China

Publisher:

IEEE

References (39 articles; first 5 listed):

1. A unified architecture for accelerating distributed DNN training in heterogeneous GPU/CPU clusters; Jiang; Proc. OSDI, 2020

2. TicTac: Accelerating distributed deep learning with communication scheduling; Hashemi; Proc. MLSys, 2019

3. On large-batch training for deep learning: Generalization gap and sharp minima; Keskar; Proc. ICLR, 2017

4. The effect of batch size on the generalizability of the convolutional neural networks on a histopathology dataset

5. PipeDream

Cited by 3 articles:

1. DeepTM: Efficient Tensor Management in Heterogeneous Memory for DNN Training; IEEE Transactions on Parallel and Distributed Systems; 2024-11

2. STR: Hybrid Tensor Re-Generation to Break Memory Wall for DNN Training; IEEE Transactions on Parallel and Distributed Systems; 2023-08

3. XEngine: Optimal Tensor Rematerialization for Neural Networks in Heterogeneous Environments; ACM Transactions on Architecture and Code Optimization; 2022-12-16
