Similar Literature
19 similar documents found (search time: 140 ms)
1.
We attempt to exploit the information supplied by objective function values in limited-memory algorithms. First, a new quadratic function is constructed from interpolation conditions to approximate the objective function, yielding a new weak secant equation. Combining this weak secant equation with the weak secant equation of Yuan [1], we present a family of limited-memory BFGS-type algorithms that includes the standard L-BFGS, and prove the convergence of this family. Numerical experiments on test functions selected from the standard CUTE test library show that the algorithms in this family perform similarly to the standard L-BFGS.

2.
Li Bo, Lu Dianjun. 《数学杂志》 (Journal of Mathematics), 2014, 34(4): 773-778
This paper studies the global optimization problem. By the filled-function construction technique, a new parameter-free filled function is proposed; it is an explicit expression in terms of the objective function. A new parameter-free filled function algorithm is obtained, and numerical results show that the algorithm is effective, thereby extending the application of filled function algorithms to global optimization.

3.
For nondifferentiable functions of "max" type, an aggregate function can be used to construct a smooth approximation. Using this technique, a self-adjusting interior-point algorithm for linear complementarity problems is given. Based on the relation between the proximity measure and the standard centering equation of the linear complementarity problem, a new proximity function is defined, and the optimality condition of minimizing this function replaces the centering equation. This builds a self-adjusting mechanism into the perturbed equations themselves, so that the Newton direction adapts according to information from the previous iterate. Based on the modified perturbed system, a self-adjusting interior-point algorithm is established. Numerical experiments on test problems show the effectiveness and stability of the algorithm.
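The aggregate function mentioned here is commonly taken to be the log-sum-exp (entropy) smoothing of the max function; a minimal sketch, assuming that standard choice (the function name and the parameter p are illustrative, not from the paper):

```python
import numpy as np

def aggregate_max(x, p=100.0):
    """Smooth approximation of max(x) by the aggregate function
    (1/p) * log(sum_i exp(p * x_i)). The approximation always lies in
    [max(x), max(x) + log(n)/p], so larger p means a tighter fit."""
    x = np.asarray(x, dtype=float)
    m = x.max()                       # shift for numerical stability
    return m + np.log(np.exp(p * (x - m)).sum()) / p

vals = [1.0, 2.5, 2.4]
approx = aggregate_max(vals, p=200.0)  # close to max(vals) = 2.5
```

The shift by `m` avoids overflow in `exp`, a standard trick when evaluating log-sum-exp.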

4.
For the two-dimensional unsaturated soil water movement equation, a new numerical algorithm is constructed by combining the radial basis function (RBF) collocation method with the finite difference method. The algorithm first treats the nonlinear term with a difference scheme and then solves the equation with an implicit RBF collocation scheme, avoiding the obstacle that the nonlinear term prevents direct use of collocation; existence and uniqueness of the algorithm's solution are proved. The algorithm is validated through numerical simulation of unsaturated soil water movement against experimental data; the simulated results agree closely with the experimental ones, showing that the algorithm is practical and effective. The accuracy of different radial basis functions and different algorithms is also compared: the new radial basis function yields better accuracy than the MQ and Gaussian functions, and the proposed method has certain advantages over the finite difference and finite element methods.

5.
A hash function family is a set of functions from one finite set to another; any code can be represented as a hash function family, and perfect hash function families have important applications in cryptography. Using matrix and graph-theoretic methods, the structure of a class of separating hash function families is studied, and several structure theorems for separating hash function families are obtained.

6.
Based on a class of modified secant equations with a single parameter γ, a modified BB (Barzilai-Borwein) step size αk(γ) with parameter γ is given, and an optimal value γ = 8/3 is obtained in a certain sense. Further, according to the convexity of the objective function on the line segment joining the current and previous iterates, the step size αk(γ) is modified; combined with the Zhang-Hager nonmonotone line search technique, a class of adaptive modified BB algorithms (AMBB) for unconstrained optimization is given. Under suitable assumptions the AMBB algorithm is globally convergent, and when the objective function is strongly convex it converges linearly. Numerical experiments show that the AMBB algorithm corresponding to γ = 8/3 is very effective.
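For context, the standard (unmodified) BB step that the abstract's αk(γ) generalizes is αk = sᵀs / sᵀy with s = xk − xk−1 and y = gk − gk−1; a minimal sketch of a BB gradient iteration under that standard formula (function names and safeguards are illustrative, not the paper's AMBB algorithm):

```python
import numpy as np

def bb_gradient(grad, x0, iters=200):
    """Gradient method with the standard BB step
    alpha_k = s^T s / s^T y, where s = x_k - x_{k-1}, y = g_k - g_{k-1}.
    The paper's alpha_k(gamma) is a parametrized modification of this."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3                       # conservative initial step
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                 # keep the step positive and finite
            alpha = (s @ s) / sy       # BB1 step size
        x, g = x_new, g_new
    return x

# minimize 0.5 * x^T diag(1, 10) x, whose unique minimizer is the origin
A = np.diag([1.0, 10.0])
xmin = bb_gradient(lambda z: A @ z, np.array([3.0, -2.0]))
```

On strongly convex quadratics the BB iteration converges without any line search, which is why nonmonotone safeguards such as Zhang-Hager are paired with it in the general case.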

7.
Using a smoothing function, a perturbed system of equations for the KT conditions of inequality-constrained optimization is established, and a new interior-point-type algorithm is proposed. When the algorithm terminates in finitely many steps, the current iterate is an exact stationary point of the optimization problem. Under certain conditions the algorithm is globally convergent, and numerical experiments show that it is effective.

8.
The absolute value equation Ax − |x| = b is a nondifferentiable NP-hard problem. Under the assumption that the singular values of the matrix A exceed 1 (the singular values of A being defined as the nonnegative square roots of the eigenvalues of AᵀA), a new smoothing algorithm for the absolute value equation is given. An aggregate function is introduced to smooth the absolute value equation, yielding a system of nonlinear equations; by introducing an appropriate objective function, the absolute value equation is then converted into an unconstrained optimization problem, which is solved with a quasi-Newton algorithm. Numerical results show the correctness and effectiveness of the method.
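A minimal sketch of the smooth-then-minimize idea, assuming the aggregate smoothing of |t| = max(t, −t) is the usual log-sum-exp form (1/p)·log(e^{pt} + e^{−pt}), and using SciPy's generic BFGS in place of the paper's specific quasi-Newton algorithm (all names are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def smooth_abs(x, p=50.0):
    """Aggregate smoothing of |x_i| = max(x_i, -x_i):
    (1/p)*log(exp(p*x) + exp(-p*x)), written in an overflow-safe form.
    The per-entry error is at most log(2)/p."""
    return np.abs(x) + np.log1p(np.exp(-2.0 * p * np.abs(x))) / p

def solve_ave(A, b, p=50.0):
    """Solve Ax - |x| = b approximately by minimizing the squared
    residual of the smoothed equation with a quasi-Newton method."""
    obj = lambda x: np.sum((A @ x - smooth_abs(x, p) - b) ** 2)
    res = minimize(obj, np.zeros(len(b)), method="BFGS")
    return res.x

A = np.array([[3.0, 1.0], [1.0, 4.0]])   # singular values exceed 1
x_true = np.array([1.0, -2.0])
b = A @ x_true - np.abs(x_true)          # construct a consistent right side
x = solve_ave(A, b)                      # recovers x_true up to smoothing error
```

When the singular values of A exceed 1, the smoothed residual map has a nonsingular Jacobian everywhere, so every stationary point of the squared residual is a genuine root, which is what makes the unconstrained reformulation reliable.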

9.
Using the curvature information obtained at the previous step in place of the curvature of the quadratic model on the segment from xk to xk+1, a BFGS-like algorithm with convergence properties similar to BFGS is given, and the relation between the new algorithm and self-scaling quasi-Newton methods is revealed. Standard test functions are selected from the CUTE test library for numerical comparison against the standard BFGS algorithm and other modified BFGS algorithms. The results show that the new algorithm behaves somewhat like a self-scaling quasi-Newton algorithm.

10.
By applying a two-parameter Broyden-like family of update formulas, a new approach is provided to the open problem of the convergence of quasi-Newton-type algorithms for general objective functions in unconstrained optimization.

11.
Optimization, 2012, 61(12): 2229-2246
A secant equation (quasi-Newton) plays one of the most important roles in finding an optimal solution in nonlinear optimization. Curvature information must satisfy the usual secant equation to ensure positive definiteness of the Hessian approximation. In this work, we present a new diagonal updating scheme that improves the Hessian approximation through a modified weak secant equation for the diagonal quasi-Newton (DQN) method. Both gradient and function evaluations are utilized to obtain the new weak secant equation and achieve higher-order accuracy in the curvature information. The modified DQN methods based on the modified weak secant equation are globally convergent. Extensive numerical results indicate the advantages of the modified DQN methods over the usual ones and over some classical conjugate gradient methods.

12.
In this work some interesting relations between results on basic optimization and algorithms for nonconvex functions (such as BFGS and secant methods) are pointed out. In particular, some innovative tools for improving our recent secant and BFGS-type algorithms are described in detail.

13.
This paper presents a modified quasi-Newton method for structured unconstrained optimization. The usual SQN equation employs only the gradients and ignores the available function value information. Several researchers have paid attention to other secant conditions to get a better approximation of the Hessian matrix of the objective function. Recently, Yabe et al. (2007) [6] proposed a modified secant condition which uses both gradient and function value information in order to get higher-order accuracy in approximating the second-order curvature of the objective function. In this paper, we derive a new progressive modified SQN equation, with a vector parameter, which uses both the available gradient and function value information and maintains most properties of the usual and modified structured quasi-Newton methods. Furthermore, local and superlinear convergence of the algorithm is obtained under some reasonable conditions.

14.
Symmetric rank-one (SR1) is one of the competitive formulas among the quasi-Newton (QN) methods. In this paper, we propose some modified SR1 updates based on the modified secant equations, which use both gradient and function information. Furthermore, to avoid the loss of positive definiteness and zero denominators of the new SR1 updates, we apply a restart procedure to this update. Three new algorithms are given to improve the Hessian approximation with modified secant equations for the SR1 method. Numerical results show that the proposed algorithms are very encouraging and the advantage of the proposed algorithms over the standard SR1 and BFGS updates is clearly observed.

15.
In this paper, an adaptive trust region algorithm that uses Moreau–Yosida regularization is proposed for solving nonsmooth unconstrained optimization problems. The proposed algorithm combines a modified secant equation with the BFGS update formula and an adaptive trust region radius, and the new trust region radius utilizes not only the function information but also the gradient information. The global convergence and the local superlinear convergence of the proposed algorithm are proven under suitable conditions. Finally, the preliminary results from comparing the proposed algorithm with some existing algorithms using numerical experiments reveal that the proposed algorithm is quite promising for solving nonsmooth unconstrained optimization problems.

16.
This paper provides a family of new filter line-search modified secant methods for solving nonlinear equality-constrained optimization problems. The new family is characterized as follows: a search direction is obtained by one of the algorithms in the modified secant family; a backtracking line-search technique yields the step size; a filter criterion decides whether to accept the step; and a second-order correction technique is introduced to reduce infeasibility and overcome the Maratos effect. Under reasonable assumptions, the global convergence of the algorithms is analyzed, and it is proved that, with the additional second-order correction step, the family overcomes the Maratos effect and converges two-step Q-superlinearly to a local solution satisfying the second-order sufficient optimality conditions. Numerical results show the effectiveness of the proposed algorithms.

17.
For solving unconstrained minimization problems, quasi-Newton methods are popular iterative methods. The secant condition, which employs only the gradient information, is imposed on these methods. Several researchers have paid attention to other secant conditions to get a better approximation of the Hessian matrix of the objective function. Recently, Zhang et al. [New quasi-Newton equation and related methods for unconstrained optimization, J. Optim. Theory Appl. 102 (1999) 147-167] and Zhang and Xu [Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations, J. Comput. Appl. Math. 137 (2001) 269-278] proposed a modified secant condition which uses both gradient and function value information in order to get higher-order accuracy in approximating the second-order curvature of the objective function. They showed the local and q-superlinear convergence of the BFGS-like and DFP-like updates based on their proposed secant condition. In this paper, we incorporate one parameter into this secant condition to switch smoothly between the standard secant condition and the secant condition of Zhang et al. We consider a modified Broyden family which includes the BFGS-like and the DFP-like updates proposed by Zhang et al., and we prove the local and q-superlinear convergence of our method.
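The modified secant condition of Zhang et al. replaces y with ỹ = y + (θ/sᵀu)·u, where θ = 6(f_k − f_{k+1}) + 3(g_k + g_{k+1})ᵀs and u is any vector with sᵀu ≠ 0 (often u = s); for a quadratic objective θ vanishes, so the condition reduces to the standard one. A minimal sketch (the function name and default u = s are illustrative):

```python
import numpy as np

def modified_y(f_k, f_k1, g_k, g_k1, s, u=None):
    """Zhang et al. modified secant vector:
        y_tilde = y + (theta / s^T u) * u,
        theta   = 6*(f_k - f_{k+1}) + 3*(g_k + g_{k+1})^T s.
    Uses both function values and gradients; theta = 0 for quadratics."""
    u = s if u is None else u
    y = g_k1 - g_k
    theta = 6.0 * (f_k - f_k1) + 3.0 * (g_k + g_k1) @ s
    return y + (theta / (s @ u)) * u

# sanity check on a quadratic f(x) = 0.5 x^T A x, where theta must vanish
A = np.array([[2.0, 0.3], [0.3, 1.0]])
f = lambda x: 0.5 * x @ A @ x
g = lambda x: A @ x
xk, xk1 = np.array([1.0, 1.0]), np.array([0.2, -0.5])
s = xk1 - xk
y_mod = modified_y(f(xk), f(xk1), g(xk), g(xk1), s)  # equals g(xk1) - g(xk)
```

The extra θ-term is what lifts the accuracy of sᵀỹ as an estimate of sᵀ∇²f·s by one order on general smooth functions.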

18.
Zhu Detong. 《应用数学》 (Mathematica Applicata), 1999, 12(2): 65-71
Based on the approximate Fletcher penalty function suggested by Powell and Yuan, used as a merit function with a monotone line-search technique, this paper provides a class of secant methods for solving constrained optimization problems. Under reasonable conditions, the global convergence and convergence rate of the proposed algorithms are proved.

19.
This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited memory matrix, and plays the dual role of preconditioning the inner conjugate gradient iteration in the HFN method and of providing an initial matrix for L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithms are both effective and not sensitive to the choice of parameters.
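The limited-memory matrix described above is applied through the standard L-BFGS two-loop recursion, which computes the product of the inverse Hessian approximation with a vector directly from the stored (s, y) pairs. A generic textbook sketch, not the paper's exact implementation (the function name and the scalar initial matrix gamma·I are illustrative):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list, gamma):
    """Two-loop recursion: returns H_k @ g, where H_k is the L-BFGS
    inverse Hessian approximation built from the stored pairs
    (s_i, y_i), with initial matrix gamma * I."""
    q = g.copy()
    alphas = []
    for s, y in reversed(list(zip(s_list, y_list))):
        a = (s @ q) / (s @ y)
        alphas.append(a)
        q -= a * y
    r = gamma * q                       # apply the initial matrix
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = (y @ r) / (s @ y)
        r += (a - b) * s
    return r

# with one stored pair, the recursion satisfies the secant condition H y = s
s = np.array([1.0, 2.0, -1.0])
y = np.array([0.5, 1.0, 0.2])
d = lbfgs_direction(y, [s], [y], gamma=0.5)  # equals s
```

With an empty memory the recursion returns gamma·g, which is exactly the preconditioning role the abstract assigns to the limited-memory matrix inside the HFN inner iteration.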
