A New Two-Point Stepsize Gradient Method for Solving Unconstrained Optimization Problems
Introduction
Unconstrained optimization problems arise in various disciplines such as engineering, economics, and science. The goal of unconstrained optimization is to find the values of the decision variables that minimize or maximize a specific objective function. The gradient descent method is one of the most popular methods for solving unconstrained optimization problems. In this paper, we propose a new two-point stepsize gradient descent method for the unconstrained optimization problem. This paper discusses the algorithm, its convergence properties, and its advantages over existing gradient descent methods.
Algorithm for the two-point stepsize gradient descent method
The standard gradient descent method updates the decision variables iteratively as follows:
x_{k+1} = x_k - t_k ∇f(x_k)
where x_k is the value of the decision variable at iteration k, t_k is the step size at iteration k, and ∇f(x_k) is the gradient of the objective function at x_k.
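For reference, this update can be written as a single Python step; the names grad_f, x, and t below are illustrative placeholders, not notation taken from the paper:

```python
import numpy as np

def gd_step(x, t, grad_f):
    # Standard gradient descent update: x_{k+1} = x_k - t_k * grad f(x_k)
    return np.asarray(x, dtype=float) - t * np.asarray(grad_f(x))
```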
Our proposed two-point stepsize gradient descent method is an extension of the standard gradient descent method. Instead of using a fixed step size in each iteration, we use a two-point step size that is computed based on the curvature of the objective function. The algorithm for the two-point stepsize gradient descent method is as follows:
Step 1: Initialize x_0 and set k = 0.
Step 2: Compute the gradient of the objective function at x_k, i.e., ∇f(x_k).
Step 3: Compute the function values at the two candidate points x_k - (t_k/2) ∇f(x_k) and x_k - t_k ∇f(x_k), where t_k is the step size at iteration k.
Step 4: Choose the candidate point that produces the smaller function value and set x_{k+1} to that point.
Step 5: Update the step size using the formula t_{k+1} = α t_k, where α is a constant between 0 and 1.
Step 6: If the stopping criterion is satisfied, stop. Otherwise, set k = k + 1 and go to Step 2.
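To make the six steps concrete, the following Python sketch implements them as described above. The objective f, its gradient grad_f, the initial step size t0, the shrink factor alpha, and the gradient-norm stopping test are illustrative choices, not values or criteria prescribed by the paper.

```python
import numpy as np

def two_point_stepsize_gd(f, grad_f, x0, t0=1.0, alpha=0.5, tol=1e-6, max_iter=1000):
    """Sketch of the two-point stepsize gradient descent method described above.

    Each iteration evaluates the objective at two candidate points (a half step
    and a full step along the negative gradient), keeps the better one, and
    shrinks the step size by the factor alpha in (0, 1).
    """
    x, t = np.asarray(x0, dtype=float), t0            # Step 1: initialize x_0 and t_0
    for k in range(max_iter):
        g = grad_f(x)                                 # Step 2: gradient at x_k
        if np.linalg.norm(g) < tol:                   # Step 6: stopping criterion (illustrative)
            break
        half_step = x - 0.5 * t * g                   # Step 3: candidate x_k - (t_k/2) grad f(x_k)
        full_step = x - t * g                         #         candidate x_k - t_k grad f(x_k)
        x = half_step if f(half_step) < f(full_step) else full_step  # Step 4: keep the better point
        t = alpha * t                                 # Step 5: t_{k+1} = alpha * t_k
    return x

# Example usage on a simple quadratic objective
if __name__ == "__main__":
    f = lambda x: float(np.sum(x ** 2))
    grad_f = lambda x: 2.0 * x
    print(two_point_stepsize_gd(f, grad_f, x0=[3.0, -4.0]))
```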
Convergence properties
We now prove the convergence of the two-point stepsize gradient descent method. We assume that the objective function f(x) is continuously differentiable and has a unique global minimum.
Theorem: Let {x_k} be the sequence generated by the two-point stepsize gradient descent method. Suppose that the step size sequence {t_k} satisfies:
1) ∑_{k=0}^∞ t_k = ∞
2) ∑_{k=0}^∞ t_k^2 < ∞
Then {x_k} converges to the unique global minimum of f(x).
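As an illustration (this particular choice is not prescribed by the paper), a step-size sequence of the form t_k = t_0/(k+1) satisfies both conditions, since the harmonic series diverges while the squared terms are summable:

```latex
\sum_{k=0}^{\infty} \frac{t_0}{k+1} = \infty,
\qquad
\sum_{k=0}^{\infty} \left(\frac{t_0}{k+1}\right)^2 = \frac{\pi^2}{6}\, t_0^2 < \infty .
```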