On memory gradient method with trust region for unconstrained optimization
Author:
Publisher: Springer Science and Business Media LLC
Subject: Applied Mathematics
Link: http://link.springer.com/content/pdf/10.1007/s11075-005-9008-0.pdf
References (31 articles):
1. J. Barzilai and J.M. Borwein, Two-point step size gradient methods, IMA J. Numer. Anal. 8 (1988) 141–148.
2. D.P. Bertsekas, Constrained Optimization and Lagrange Multiplier Methods (Academic, New York, 1982).
3. J.W. Cantrell, Relation between the memory gradient method and the Fletcher–Reeves method, J. Optim. Theory Appl. 4 (1969) 67–71.
4. E.E. Cragg and A.V. Levy, Study on a supermemory gradient method for the minimization of functions, J. Optim. Theory Appl. 4 (1969) 191–205.
5. A.R. Conn, N. Gould, A. Sartenaer and P.L. Toint, Global convergence of a class of trust region algorithms for optimization using inexact projections on convex constraints, SIAM J. Optim. 3 (1993) 164–221.
Cited by 7 articles:
1. A memory gradient method based on the nonmonotone technique, Journal of Industrial & Management Optimization (2017).
2. A memory gradient method for non-smooth convex optimization, International Journal of Computer Mathematics (2014-09-16).
3. A nonmonotone supermemory gradient algorithm for unconstrained optimization, Journal of Applied Mathematics and Computing (2013-12-04).
4. A new variant of the memory gradient method for unconstrained optimization, Optimization Letters (2011-06-15).
5. A new supermemory gradient method for unconstrained optimization problems, Optimization Letters (2011-04-17).