Similar Documents
20 similar documents found (search time: 125 ms)
1.
An Improved HS Conjugate Gradient Algorithm and Its Global Convergence   (cited 14 times: 0 self-citations, 14 by others)
时贞军 (Shi Zhenjun), 《计算数学》 (Mathematica Numerica Sinica), 2001, 23(4): 393-406
1. Introduction. In 1952, M. Hestenes and E. Stiefel proposed the conjugate gradient method for solving positive definite linear systems [1]. In 1964, R. Fletcher and C. Reeves extended the method to the following unconstrained optimization problem: min f(x), x ∈ R^n, (1) where f: R^n → R^1 is continuously differentiable; write g_k = ∇f(x_k), x_k ∈ R^n. Suppose the sequence {x_k} is generated by the algorithm x_{k+1} = x_k + α_k d_k, d_1 = −g_1, d_k = −g_k + β_k d_{k−1}, where β_k = [g_k^T (g_k − g_{k−1})] / [d_{k−1}^T (g_k − g_{k−1})] (Hestenes-Stiefel). (4) Then the algorithm is called the Hestenes-Stiefel conjugate gradient ...
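The Hestenes-Stiefel update (4) can be sketched in a few lines. The following is a minimal illustration applied to a quadratic test function, where the exact line-search step has a closed form; it is the classical method only, not the improved variant proposed in the paper:

```python
import numpy as np

def hs_cg_quadratic(A, b, x0, tol=1e-10):
    """Hestenes-Stiefel conjugate gradient for the quadratic
    f(x) = 0.5 x^T A x - b^T x  (A symmetric positive definite),
    whose gradient is g(x) = A x - b; exact line search."""
    x = np.asarray(x0, dtype=float)
    g = A.dot(x) - b
    d = -g                               # initial steepest-descent direction
    for _ in range(len(b)):
        if np.linalg.norm(g) < tol:
            break
        Ad = A.dot(d)
        alpha = -g.dot(d) / d.dot(Ad)    # exact line-search step
        x = x + alpha * d
        g_new = A.dot(x) - b
        y = g_new - g                    # g_k - g_{k-1}
        beta = g_new.dot(y) / d.dot(y)   # Hestenes-Stiefel beta_k, formula (4)
        d = -g_new + beta * d
        g = g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = hs_cg_quadratic(A, b, np.zeros(2))   # -> approximately [0.2, 0.4]
```

On a symmetric positive definite quadratic, these iterates reach the exact minimizer A^{-1}b in at most n steps.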

2.
This paper studies the convergence of solutions of the system of differential equations ẋ_i = F_i(x_1, ..., x_n) (x ∈ R^n_+). Suppose the system satisfies the following conditions: (i) F(0) = 0; (ii) F_i(x_1, ..., x_n) is monotone increasing in x_k (k ≠ i); (iii) F(x ∗ g(s)) ≥ h(s) ∗ F(x) (0 < s < 1), where x ∗ y = (x_1 y_1, ..., x_n y_n) and g, h: [0,1] → [0,1]^n satisfy g_i(0) = h_i(0) = 0, g_i(1) = h_i(1) = 1, and 0 < g_i(s), h_i(s) < 1 for s ∈ (0,1); (iv) every solution of the system is bounded in R^n_+. Then every solution converges to an equilibrium. The result is also extended to discrete order-preserving dynamical systems.

3.
1. Introduction. Consider solving the following inequality-constrained minimization problem by the ODE method based on the modified barrier function (MBF-ODE): [display omitted in the excerpt], where f_i ∈ C^2: R^n → R, i = 0, 1, ..., m. The general form of the ODE for unconstrained minimization is [display omitted in the excerpt], where φ(x) ∈ C^1: Ω ⊂ R^n → R and s(x) ∈ C^1: Ω ⊂ R^n → R^n satisfy φ(x) > 0 and s^T(x)∇f(x) < 0, and f(x) ∈ C^1: R^n → R is the objective function. To make (1.1) amenable to the ODE method, a penalty function can be used to transform (1.1) into an unconstrained minimization problem (see [7]). Since the classical barrier function (CBF) presents considerable computational difficulties, we adopt the modified barrier function (MBF). Its basic idea is to use ...

4.
1. Introduction. Consider the following nonlinear system with linearly parameterized triangular structure (LPTS): [display omitted in the excerpt], where x = (x_1, ..., x_n)^T ∈ R^n is the system state, u ∈ R is the control input, θ ∈ R^p is a vector of unknown constant parameters, and γ_i0, γ_i', β'(x), β_0 are known smooth nonlinear functions with γ_i0(0) = 0, γ_i'(0) = 0, β_0(0) ≠ 0 (i = 1, ..., n). Such systems were discussed by D. Seto, A. M. Annaswamy, and J. Baillieul in [1], but they used multiple estimates for the same parameter, so that the ...

5.
1. Introduction. Let X be a real Banach space and S ⊂ X a closed subset. Consider the following multiobjective optimization problem (VP): [display omitted in the excerpt], where f_k, k ∈ N ≡ {1, ..., n}, g_i, i ∈ M ≡ {1, ..., m}, and h_j, j ∈ P ≡ {1, ..., p} are locally Lipschitz functions defined on some open set containing S. The set S_0 = {x ∈ S : g_i(x) ≤ 0, i ∈ M; h_j(x) = 0, j ∈ P} is called the feasible set of (VP). For the definitions of local efficient and local weakly efficient solutions of (VP), see [2]. For a locally Lipschitz function φ: X → R, ∂φ(x) denotes the Clarke generalized gradient of φ at x [3]. For the nonsmooth multiobjective optimization problem (V...

6.
A New NCP-Function for Box-Constrained Variational Inequalities and Its Generalized Newton Method   (cited 6 times: 0 self-citations, 6 by others)
陈国庆, 曹兵 (Chen Guoqing, Cao Bing), 《计算数学》 (Mathematica Numerica Sinica), 2002, 24(1): 91-4
1. Introduction. Let X ⊆ R^n and F: R^n → R^n. The variational inequality, denoted VI(X, F), is the problem: find x* ∈ X such that (y − x*)^T F(x*) ≥ 0 for all y ∈ X. When X is the box [a, b], VI(X, F) is called a box-constrained variational inequality, denoted VI([a, b], F). If a_i = 0 and b_i = +∞, it becomes the nonlinear complementarity problem NCP(F): find x ∈ R^n ...
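As background for readers unfamiliar with NCP-functions: a function φ is an NCP-function if φ(a, b) = 0 exactly when a ≥ 0, b ≥ 0, and ab = 0, so that NCP(F) can be recast as a system of equations. The new NCP-function of the paper is not reproduced in this excerpt; the classical Fischer-Burmeister function below is a standard example of the concept:

```python
import math

def fischer_burmeister(a, b):
    """Classical Fischer-Burmeister NCP-function:
    phi(a, b) = sqrt(a^2 + b^2) - a - b,
    which vanishes exactly when a >= 0, b >= 0 and a*b = 0."""
    return math.sqrt(a * a + b * b) - a - b

# complementary pairs map to zero ...
print(fischer_burmeister(0.0, 3.0))   # 0.0
print(fischer_burmeister(2.0, 0.0))   # 0.0
# ... non-complementary pairs do not
print(fischer_burmeister(1.0, 1.0))   # sqrt(2) - 2, about -0.586
print(fischer_burmeister(-1.0, 2.0))  # sqrt(5) - 1, about 1.236
```

Applying such a function componentwise to (x_i, F_i(x)) turns NCP(F) into a root-finding problem, which is what makes generalized (semismooth) Newton methods applicable.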

7.
岳优兰, 王月山 (Yue Youlan, Wang Yueshan), 《数学季刊》 (Chinese Quarterly Journal of Mathematics), 1999, 14(2): 108-110
§1. Introduction. A locally integrable function f(x) belongs to Lip_α(R^n) if there is a constant C such that for every x, y ∈ R^n, |f(x) − f(y)| ≤ C|x − y|^α. The smallest constant C satisfying the above is called the Lipschitz norm of f and is denoted by ‖f‖_{Λα}. By [1], f ∈ Lip_α(R^n) is equivalent to f ∈ ε_{α,2}, where ε_{α,2} = ...

8.
朱建青, 靳丽丽 (Zhu Jianqing, Jin Lili), 《数学季刊》 (Chinese Quarterly Journal of Mathematics), 1999, 14(1): 102-110
§1. Introduction. We consider the following optimization problem: (P) min_{x∈X} f(x) (1), where X = {x | x ∈ E^n, g_j(x) ≤ 0, j = 1, 2, ..., m; g_j(x) = 0, j = m+1, ..., m+l}; let I = {1, 2, ..., m} and L = {m+1, ..., m+l}. For the problem (P) with L = ∅, many efficient projection-type algorithms [1-11] have been proposed. In [12], ...

9.
Sobolev Exponents of Semilinear Wave Equations in Higher-Dimensional Spaces   (cited 6 times: 0 self-citations, 6 by others)
Gustavo Ponce and Thomas C. Sideris [4] conjectured that for some semilinear wave equations with special nonlinear terms, such as u_tt − Δu = u^k (Du)^α (x ∈ R^n, k ∈ Z^+, l = |α| ≥ 2), the Sobolev exponent lies between n/2 and n/2 + 1. In [4] the question was answered for x ∈ R^3. In this paper, for n ≥ 3 space dimensions, the Sobolev exponent of the semilinear wave equation u_tt − Δu = u^k (Du)^α (x ∈ R^n, k ∈ Z^+, l = |α| ≥ 2) is shown to be max{n/2 + 1/2, (n/2 − 1)·(l − 3)/(l − 1) + 2}, which indeed lies in the interval [n/2 + 1/2, n/2 + 1].

10.
1. Introduction. Consider the odd-order nonlinear functional differential equation [x(t) − c x(t − τ)]^{(n)} + p(t) f(x(t − σ)) = 0. (1) For equation (1) we make the following assumptions (H): (H1) n > 1 is an odd integer and p ∈ C([t_0, ∞), (0, ∞)); (H2) τ > 0, σ > 0, and 0 ≤ c ≤ 1; (H3) f ∈ C(R, R) is monotone increasing, x f(x) > 0 for x ≠ 0, and |f(x)| → ∞ as |x| → ∞. Let δ = max{τ, σ} and φ ∈ C([T − δ, T], R). A solution of (1) on [T, ∞) is a function x ∈ C([T, ∞), R) such that x(t) = φ(t) for T − δ ≤ t ≤ T and [x(t) − c x...

11.
During the last few years, conjugate-gradient methods have been found to be the best available tool for large-scale minimization of nonlinear functions occurring in geophysical applications. While vectorization techniques have been applied to linear conjugate-gradient methods designed to solve symmetric linear systems of algebraic equations, arising mainly from discretization of elliptic partial differential equations, due to their suitability for vector or parallel processing, no such effort was undertaken for the nonlinear conjugate-gradient method for large-scale unconstrained minimization.

Computational results are presented here using a robust memoryless quasi-Newton-like conjugate-gradient algorithm by Shanno and Phua applied to a set of large-scale meteorological problems. These results point to the vectorization of the conjugate-gradient code inducing a significant speed-up in the function and gradient evaluation for the nonlinear conjugate-gradient method, resulting in a sizable reduction in the CPU time for minimizing nonlinear functions of 10^4 to 10^5 variables. This is particularly true for many real-life problems where the gradient and function evaluation take the bulk of the computational effort.

It is concluded that vector computers are advantageous for large-scale numerical optimization problems where local minima of nonlinear functions are to be found using the nonlinear conjugate-gradient method.

This research was supported by the Florida State University Supercomputer Computations Research Institute, which is partially funded by the US Department of Energy through Contract No. DE-FC05-85ER250000.

12.
In 1952, Hestenes and Stiefel first established, along with the conjugate-gradient algorithm, fundamental relations which exist between conjugate direction methods for function minimization on the one hand and Gram-Schmidt processes relative to a given positive-definite, symmetric matrix on the other. This paper is based on a recent reformulation of these relations by Hestenes which yields the conjugate Gram-Schmidt (CGS) algorithm. CGS includes a variety of function minimization routines, one of which is the conjugate-gradient routine. This paper gives the basic equations of CGS, including the form applicable to minimizing general nonquadratic functions of n variables. Results of numerical experiments of one form of CGS on five standard test functions are presented. These results show that this version of CGS is very effective.

The preparation of this paper was sponsored in part by the US Army Research Office, Grant No. DH-ARO-D-31-124-71-G18.

The authors wish to thank Mr. Paul Speckman for the many computer runs made using these algorithms. They served as a good check on the results which they had obtained earlier. Special thanks must go to Professor M. R. Hestenes whose constant encouragement and assistance made this paper possible.

13.
A conjugate-gradient optimization method which is invariant to nonlinear scaling of a quadratic form is introduced. The technique has the property that the search directions generated are identical to those produced by the classical Fletcher-Reeves algorithm applied to the quadratic form. The approach enables certain nonquadratic functions to be minimized in a finite number of steps. Several examples which illustrate the efficacy of the method are included.

14.
A Dogleg Method for the New Conic-Model Trust-Region Subproblem   (cited 1 time: 0 self-citations, 1 by others)
Based on the optimality conditions of the new conic-model trust-region subproblem, this paper studies the properties of the conic function of the new subproblem and analyzes its monotonicity along the gradient direction and along the line joining it to the Newton direction. On this basis, a dogleg method for solving the new conic-model trust-region subproblem is proposed, and it is proved that this subalgorithm satisfies the descent condition required for the global convergence of trust-region methods for unconstrained optimization. Numerical experiments show that the algorithm is effective.
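The paper's dogleg method for the conic model is not given in this excerpt. As background, the classical dogleg step for the quadratic trust-region model m(p) = g^T p + (1/2) p^T B p, ‖p‖ ≤ Δ, which the conic variant generalizes, can be sketched as follows:

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Classical dogleg step for the quadratic trust-region model
    m(p) = g^T p + 0.5 p^T B p,  ||p|| <= delta,
    with B symmetric positive definite."""
    p_newton = -np.linalg.solve(B, g)             # unconstrained minimizer
    if np.linalg.norm(p_newton) <= delta:
        return p_newton
    p_cauchy = -(g.dot(g) / g.dot(B.dot(g))) * g  # minimizer along -g
    if np.linalg.norm(p_cauchy) >= delta:
        return -(delta / np.linalg.norm(g)) * g   # truncated gradient step
    # otherwise follow the segment from p_cauchy toward p_newton
    # until it meets the trust-region boundary ||p|| = delta
    d = p_newton - p_cauchy
    a = d.dot(d)
    bq = 2.0 * p_cauchy.dot(d)
    c = p_cauchy.dot(p_cauchy) - delta ** 2
    tau = (-bq + np.sqrt(bq * bq - 4.0 * a * c)) / (2.0 * a)
    return p_cauchy + tau * d

g = np.array([1.0, 2.0])
B = np.array([[4.0, 0.0], [0.0, 2.0]])
p_small = dogleg_step(g, B, 0.5)   # gradient-limited step, norm 0.5
p_mid = dogleg_step(g, B, 1.0)     # dogleg-segment step, norm 1.0
p_full = dogleg_step(g, B, 10.0)   # full Newton step inside the region
```

The dogleg path approximates the exact curve of subproblem minimizers as Δ varies; the paper's contribution is the analogous construction for the conic function instead of m(p).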

15.
A Conic Trust-Region Method for Nonlinearly Constrained Optimization   (cited 5 times: 0 self-citations, 5 by others)
Trust-region methods are powerful optimization methods. The conic model method is a newer type of method with more information available at each iteration than standard quadratic-based methods. Can we combine their advantages to form a more powerful method for constrained optimization? In this paper we give a positive answer and present a conic trust-region algorithm for nonlinearly constrained optimization problems. The trust-region subproblem of our method is to minimize a conic function subject to the linearized constraints and the trust-region bound. The use of conic functions allows the model to interpolate function values and gradient values of the Lagrange function at both the current point and the previous iterate. Since conic functions extend quadratic functions, they approximate general nonlinear functions better than quadratic functions do. At the same time, the new algorithm possesses robust global properties; in this paper we establish its global convergence under standard conditions.

16.
A conjugate-gradient method for unconstrained optimization, which is based on a nonquadratic model, is proposed. The technique has the same properties as the Fletcher-Reeves algorithm when applied to a quadratic function. It is shown to be efficient when tried on general functions of different dimensionality.

17.
It is known that the conjugate-gradient algorithm is at least as good as the steepest-descent algorithm for minimizing quadratic functions. It is shown here that the conjugate-gradient algorithm is actually superior to the steepest-descent algorithm in that, in the generic case, at each iteration it yields a lower cost than does the steepest-descent algorithm, when both start at the same point.

Thanks are due to Professor R. W. Sargent, Imperial College, London, England, for suggestions concerning presentation.
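The per-iteration superiority can be checked numerically on a small quadratic cost 0.5 x^T A x − b^T x: linear conjugate gradient minimizes the cost over an expanding Krylov subspace that also contains the steepest-descent iterates, so its cost after k iterations is never higher. The sketch below is illustrative and not taken from the paper:

```python
import numpy as np

def quad_cost(A, b, x):
    """Cost f(x) = 0.5 x^T A x - b^T x."""
    return 0.5 * x.dot(A).dot(x) - b.dot(x)

def sd_costs(A, b, x0, iters):
    """Cost after each exact-line-search steepest-descent iteration."""
    x, costs = x0.copy(), []
    for _ in range(iters):
        r = b - A.dot(x)                       # negative gradient
        x = x + (r.dot(r) / r.dot(A.dot(r))) * r
        costs.append(quad_cost(A, b, x))
    return costs

def cg_costs(A, b, x0, iters):
    """Cost after each linear conjugate-gradient iteration."""
    x = x0.copy()
    r = b - A.dot(x)
    d = r.copy()
    costs = []
    for _ in range(iters):
        Ad = A.dot(d)
        alpha = r.dot(r) / d.dot(Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        d = r_new + (r_new.dot(r_new) / r.dot(r)) * d
        r = r_new
        costs.append(quad_cost(A, b, x))
    return costs

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)                    # symmetric positive definite
b = rng.standard_normal(5)
cg = cg_costs(A, b, np.zeros(5), 4)
sd = sd_costs(A, b, np.zeros(5), 4)
# both methods take the same first step; generically cg is strictly
# lower at every later iteration, matching the claim above
```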

18.
The effect of nonlinearly scaling the objective function on the variable-metric method is investigated, and Broyden's update is modified so that a property of invariancy to the scaling is satisfied. A new three-parameter class of updates is generated, and criteria for an optimal choice of the parameters are given. Numerical experiments compare the performance of a number of algorithms of the resulting class.

The author is indebted to Professor S. S. Oren, Economic Engineering Department, Stanford University, Stanford, California, for stimulating discussions during the development of this paper. He also recognizes the financial support by the National Research Council of Italy (CNR) for his stay at Stanford University.

19.
In this paper, a unified method to construct quadratically convergent algorithms for function minimization is described. With this unified method, a generalized algorithm is derived. It is shown that all the existing conjugate-gradient algorithms and variable-metric algorithms can be obtained as particular cases. In addition, several new practical algorithms can be generated. The application of these algorithms to quadratic functions as well as nonquadratic functions is discussed.

This research, supported by the Office of Scientific Research, Office of Aerospace Research, United States Air Force, Grant No. AF-AFOSR-828-67, is based on Ref. 1.

20.
Making use of the Carlson-Shaffer convolution operator, we introduce and study a new class of analytic functions related to conic domains. The main object of this paper is to investigate inclusion relations and coefficient bounds for this class. We also show that the class is closed under convolution with a convex function. Some applications are also discussed.
