Similar Articles (20 results found)
1.
Efficiency indices of Newton's iteration and several improved schemes
We study the computational efficiency of Newton's iteration, the Newton secant method, and six of their improved schemes, and compute their efficiency indices. Newton's iteration, the improved Newton method, the secant method and the improved secant method (the so-called P.C. (predictor-corrector) scheme of Newton's iteration), the quadratic interpolation iteration scheme, the generalized Newton iteration, the harmonic mean Newton method, and the midpoint Newton method are shown to have efficiency indices 0.347/n, 0.3662/n, 0.4812/n, 0.4812/n, 0.347/n, 0.3662/n, 0.3662/n, and 0.3662/n, respectively. Our results show that the iteration scheme derived from the parabolic interpolation polynomial and the improved secant method do not genuinely improve the computational efficiency of the iteration. In addition, we prove that the improved secant method is equivalent to the Newton secant method, and we use this conclusion to give a simplified proof that the improved secant method has convergence order 2.618.
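The quoted values are consistent with the classical informational efficiency index ln(p)/c, where p is the convergence order and c the number of evaluations per step, scaled here by the per-evaluation cost n. A minimal sketch checking this reading for four representative schemes (the identification of each scheme's order and cost is our inference, not stated in the abstract):

```python
import math

# (order p, evaluations per step c), inferred from the quoted indices
schemes = {
    "Newton (p=2, c=2: f and f')":        (2.0, 2),
    "improved Newton (p=3, c=3)":         (3.0, 3),
    "secant (p=(1+sqrt 5)/2, c=1)":       ((1 + math.sqrt(5)) / 2, 1),
    "improved secant (p=2.618, c=2)":     ((3 + math.sqrt(5)) / 2, 2),
}

for name, (p, c) in schemes.items():
    # informational efficiency ln(p)/c; the abstract reports this value divided by n
    print(f"{name}: {math.log(p) / c:.4f}/n")
```

The printed values 0.3466, 0.3662, 0.4812, and 0.4812 match the figures quoted above, including the coincidence of the secant and improved secant indices.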

2.
《Journal of Complexity》1999,15(2):239-281
We present new results concerning the convergence of secant-type methods under conditions only at a point. The radius of robustness of these methods is given, and we apply it to the study of the complexity of homotopy methods for approximating roots. In particular, we show how to use the secant-type method to obtain an approximate zero relative to the Newton method.

3.
We present a directional secant method, a secant variant of the directional Newton method, for solving a single nonlinear equation in several variables. Under suitable assumptions, we prove the convergence and the quadratic convergence speed of this new method. Numerical examples show that the directional secant method is feasible and efficient, and has better numerical behaviour than the directional Newton method.
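The abstract does not reproduce the iteration itself; a minimal sketch of one plausible directional secant step for a single equation f(x) = 0 with x in R^n (the function names and the finite-difference construction below are our assumptions) replaces the directional derivative grad f(x) . d of the directional Newton step with a difference quotient along d:

```python
import numpy as np

def directional_secant(f, x0, d, h0=1e-4, tol=1e-12, max_iter=50):
    """Solve the single equation f(x) = 0, x in R^n, along direction d.

    Hypothetical sketch: the directional derivative grad f(x) . d of the
    directional Newton method is replaced by the secant quotient
    (f(x + h*d) - f(x)) / h along the fixed unit direction d.
    """
    x = np.asarray(x0, dtype=float)
    d = np.asarray(d, dtype=float) / np.linalg.norm(d)
    h = h0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        slope = (f(x + h * d) - fx) / h   # secant estimate of grad f . d
        step = fx / slope
        x = x - step * d
        h = min(h0, abs(step))            # shrink the difference step with progress
    return x

# Example: f(x, y) = x^2 + y^2 - 4 along d = (1, 1) converges to (sqrt 2, sqrt 2)
root = directional_secant(lambda v: v[0]**2 + v[1]**2 - 4.0, [1.0, 1.0], [1.0, 1.0])
```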

4.
For solving unconstrained minimization problems, quasi-Newton methods are popular iterative methods. The secant condition, which employs only gradient information, is imposed on these methods. Several researchers have paid attention to other secant conditions to get a better approximation of the Hessian matrix of the objective function. Recently, Zhang et al. [New quasi-Newton equation and related methods for unconstrained optimization, J. Optim. Theory Appl. 102 (1999) 147–167] and Zhang and Xu [Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations, J. Comput. Appl. Math. 137 (2001) 269–278] proposed a modified secant condition which uses both gradient and function value information in order to get higher-order accuracy in approximating the second curvature of the objective function. They showed the local and q-superlinear convergence of the BFGS-like and DFP-like updates based on their proposed secant condition. In this paper, we incorporate one parameter into this secant condition to switch smoothly between the standard secant condition and the secant condition of Zhang et al. We consider a modified Broyden family which includes the BFGS-like and DFP-like updates proposed by Zhang et al. We prove the local and q-superlinear convergence of our method.
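For reference, the modified secant condition of Zhang et al. has, as we recall it (the formula below should be verified against the cited papers), the form

```latex
B_{k+1} s_k = \tilde{y}_k, \qquad
\tilde{y}_k = y_k + \frac{\theta_k}{s_k^{T} u_k}\, u_k, \qquad
\theta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{T} s_k,
```

where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and u_k is any vector with s_k^T u_k ≠ 0. The one-parameter family described above can then be pictured as scaling theta_k between 0 (the standard secant condition) and 1 (the condition of Zhang et al.); this is our reading of the abstract, not a statement from the paper.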

5.
The secant method is one of the most popular methods for root finding. Standard textbooks in numerical analysis state that the secant method is superlinear: the rate of convergence is set by the golden ratio. Nevertheless, this property holds only for simple roots. If the multiplicity of the root is larger than one, the convergence of the secant method becomes linear. This communication includes a detailed analysis of the secant method when it is used to approximate multiple roots. Thus, a proof of the linear convergence is shown. Moreover, the values of the corresponding asymptotic convergence factors are determined and are found to be related to the golden ratio as well.
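A quick illustration of the two regimes (a minimal sketch; the test functions are ours, not the paper's): on a simple root the error exponent approaches the golden ratio, while on a double root the error shrinks only by a roughly constant factor per step.

```python
def secant(f, x0, x1, n_steps=15):
    """Plain secant iteration; returns the whole iterate history."""
    xs = [x0, x1]
    for _ in range(n_steps):
        f0, f1 = f(xs[-2]), f(xs[-1])
        if f1 == f0:                  # difference quotient breaks down near convergence
            break
        xs.append(xs[-1] - f1 * (xs[-1] - xs[-2]) / (f1 - f0))
    return xs

# Simple root sqrt(2) of x^2 - 2: superlinear, order (1 + sqrt 5)/2.
simple = secant(lambda x: x * x - 2.0, 1.0, 2.0)
# Double root 1 of (x - 1)^2: merely linear convergence.
double = secant(lambda x: (x - 1.0) ** 2, 0.0, 0.5, n_steps=25)

print([f"{abs(x - 2**0.5):.1e}" for x in simple])
print([f"{abs(x - 1.0):.1e}" for x in double])
```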

6.
An order interval secant method is given. Its rate of convergence is faster than that of the order interval Newton method in [1]. The existence and uniqueness of a solution to the nonlinear system and the convergence of the interval iterative sequence are also proved.

7.
This paper presents a modified quasi-Newton method for structured unconstrained optimization. The usual SQN equation employs only the gradients and ignores the available function value information. Several researchers have paid attention to other secant conditions to get a better approximation of the Hessian matrix of the objective function. Recently Yabe et al. (2007) [6] proposed a modified secant condition which uses both gradient and function value information in order to get higher-order accuracy in approximating the second curvature of the objective function. In this paper, we derive a new progressive modified SQN equation with a vector parameter, which uses both available gradient and function value information and maintains most properties of the usual and modified structured quasi-Newton methods. Furthermore, local and superlinear convergence of the algorithm is obtained under some reasonable conditions.

8.
On Broyden's method for nonlinear equations in several variables
An Hengbin, Bai Zhongzhi. Mathematica Numerica Sinica (《计算数学》), 2004, 26(4): 385-400
This paper proposes a Broyden method for nonlinear equations in several variables, discusses its local and semilocal convergence, and estimates its superlinear convergence rate. Numerical experiments show that the new method is feasible and effective, and that its computational efficiency is higher than that of the directional Newton method and the directional secant method.
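For context, a generic sketch of the classical "good" Broyden method that this line of work builds on (not the authors' exact algorithm, whose details are in the paper):

```python
import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=100):
    """'Good' Broyden method for F(x) = 0 (generic sketch, not the paper's variant)."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                      # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)         # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # rank-one Broyden update
        x, Fx = x_new, F_new
    return x

# Example: F(x, y) = (x^2 + y^2 - 2, x - y), with roots at (1, 1) and (-1, -1)
sol = broyden(lambda v: np.array([v[0]**2 + v[1]**2 - 2.0, v[0] - v[1]]), [1.2, 1.0])
```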

9.
For the problem of finding a simple root of a nonlinear equation, a new Newton predictor-corrector scheme is proposed. By additionally computing one function value and one first-derivative value per iteration, each iteration requires the evaluation of two function values and two first-derivative values. Compared with the second-order convergence of the standard Newton algorithm, the new algorithm has the higher convergence order 2+√6. The new algorithm is tested on benchmark functions and compared with related algorithms; the comparison shows clear advantages in the number of iterations, the computation time, and the optimal values. Finally, the new scheme is extended to multi-dimensional vector-valued functions; its convergence is proved using Taylor's formula, and two two-dimensional examples are given to verify the effectiveness of the convergence.

10.
Conjugate gradient methods are appealing for large-scale nonlinear optimization problems. Recently, expecting fast convergence, Dai and Liao (2001) made use of the secant condition of quasi-Newton methods. In this paper, we make use of the modified secant condition given by Zhang et al. (1999) and Zhang and Xu (2001) and propose a new conjugate gradient method following Dai and Liao (2001). A new feature of this method is that it takes both available gradient and function value information into account and achieves high-order accuracy in approximating the second-order curvature of the objective function. The method is shown to be globally convergent under some assumptions. Numerical results are reported.
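A hedged sketch of the two ingredients as we read the abstract (both formulas recalled from the cited papers and worth double-checking): the Dai-Liao conjugacy parameter, with y_k replaced by a function-value-corrected vector in the spirit of Zhang et al.

```python
import numpy as np

def dai_liao_beta(g_new, d, s, y, t=0.1):
    """Dai-Liao (2001) parameter: beta = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k)."""
    return g_new @ (y - t * s) / (d @ y)

def modified_y(y, s, f_old, f_new, g_old, g_new):
    """y_k corrected with function values, Zhang et al. style (formula from memory)."""
    theta = 6.0 * (f_old - f_new) + 3.0 * (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s        # the choice u_k = s_k

# Next search direction: d_new = -g_new + dai_liao_beta(g_new, d, s, y_mod) * d
```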

11.
It is well known that trust region methods are very effective for optimization problems. In this article, a new adaptive trust region method is presented for solving unconstrained optimization problems. The proposed method combines a modified secant equation with the BFGS update formula and an adaptive trust region radius, where the new trust region radius makes use of not only the function information but also the gradient information. Under suitable conditions, global convergence is proved, and we demonstrate the local superlinear convergence of the proposed method. The numerical results indicate that the proposed method is very efficient.

12.
The Powell singular function was introduced in 1962 by M.J.D. Powell as an unconstrained optimization problem. The function is also used as a nonlinear least squares problem and as a system of nonlinear equations. It is a classic test function included in collections of test problems in optimization as well as an example problem in textbooks. In the global optimization literature the function is stated as a difficult test case. The function is convex and the Hessian has a double singularity at the solution. In this paper we consider Newton's method and methods in the Halley class, and we discuss the relationship between these methods on the Powell singular function. We show that these methods have a global but only linear rate of convergence. The function belongs to a subclass of unary functions, and the results for Newton's method and methods in the Halley class can be extended to this class. Newton's method is often made globally convergent by introducing a line search. We show that a full Newton step will satisfy many of the standard step length rules and that exact line searches yield a slightly faster linear rate of convergence than Newton's method. We illustrate some of these properties with numerical experiments.
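For the record, the Powell singular function in its standard test-collection form (our transcription of the classic definition; the minimum value is 0 at the origin, where the Hessian is doubly singular, and the customary starting point is (3, -1, 0, 1)):

```python
import numpy as np

def powell_singular(x):
    """Powell's singular function as a sum of squared residuals; min value 0 at x = 0."""
    x1, x2, x3, x4 = x
    r = np.array([
        x1 + 10.0 * x2,
        np.sqrt(5.0) * (x3 - x4),
        (x2 - 2.0 * x3) ** 2,
        np.sqrt(10.0) * (x1 - x4) ** 2,
    ])
    return r @ r

x0 = np.array([3.0, -1.0, 0.0, 1.0])   # customary starting point
print(powell_singular(x0))             # 215.0
```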

13.
14.
Recently, a modification of the Newton method for finding a zero of a univariate function with local cubic convergence was introduced. Here, we extend this modification to the multi-dimensional case, i.e., we introduce a modified Newton method for vector functions that converges locally cubically, without the need to compute higher derivatives. The case of multiple roots is not treated. Per iteration the method requires one evaluation of the function vector and the solution of two linear systems with the Jacobian as coefficient matrix, where the Jacobian has to be evaluated twice. Since the additional computational effort is nearly that of an additional Newton step, the proposed method is especially useful in difficult cases where the number of iterations can be reduced by a factor of two in comparison to the Newton method. That this much better convergence is indeed possible is shown by a numerical example. The modified Newton method can also be advantageous in cases where the evaluation of the function is more expensive than solving a linear system with the Jacobian as coefficient matrix; an example involving numerical quadrature is given. Finally, we briefly discuss possible extensions of the method to make it globally convergent.
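The abstract pins down the cost profile (one function evaluation, two Jacobian evaluations, two linear solves with the Jacobian) without stating the formula. One well-known cubically convergent scheme that matches this profile exactly is the midpoint Newton variant sketched below; the identification is our guess, not necessarily the authors' method.

```python
import numpy as np

def midpoint_newton_step(F, J, x):
    """One cubically convergent step matching the stated cost profile:
    one F evaluation, two Jacobian evaluations, two linear solves."""
    Fx = F(x)
    d1 = np.linalg.solve(J(x), Fx)               # ordinary Newton direction
    d2 = np.linalg.solve(J(x - 0.5 * d1), Fx)    # Jacobian re-evaluated at the midpoint
    return x - d2
```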

15.
Based on the modified secant equation, we propose two new HS-type conjugate gradient formulas. Their forms are similar to the original HS conjugate gradient formula, and they inherit all the nice properties of the HS method. By utilizing the technique of the three-term HS method in Zhang et al. (2007) [15], without requiring truncation or convexity of the objective function, we show that one formula with the Wolfe line search and the other with the Armijo line search are globally convergent. Moreover, under some mild conditions, the linear convergence rate of the two modified methods is established. The numerical results show that the proposed methods are efficient.

16.
A third-order variant of Newton's method
Based on an integral equation for the inverse function, combined with Simpson's rule, a new root-finding method for nonlinear equations, a variant of Newton's method, is given. It is proved to converge to a simple root with order at least three, improving on Newton's method in both convergence order and efficiency index. Numerical experiments are given and comparisons are made with Newton's method and variants of the same type; the results show that the method compares favourably and enriches the toolbox of root-finding methods for nonlinear equations.
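The abstract describes the construction (an integral equation for the inverse function discretized by Simpson's rule) but not the final iteration; one standard reconstruction along these lines, offered only as a plausible sketch rather than the paper's exact scheme, is:

```python
def simpson_newton_step(f, df, x):
    """Third-order Newton variant from Simpson's rule on the inverse-function
    integral (a plausible reconstruction, not necessarily the paper's scheme)."""
    z = x - f(x) / df(x)                                      # Newton predictor
    davg = (df(x) + 4.0 * df(0.5 * (x + z)) + df(z)) / 6.0    # Simpson average of f'
    return x - f(x) / davg
```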

17.
The Hermitian and skew-Hermitian splitting (HSS) method has proved quite successful in solving large sparse non-Hermitian positive definite systems of linear equations. Recently, by making use of the HSS method as the inner iteration, Bai and Guo proposed the Newton-HSS method for solving systems of nonlinear equations with non-Hermitian positive definite Jacobian matrices. It has been shown that the Newton-HSS method outperforms the Newton-USOR and Newton-GMRES iteration methods. In this paper, a class of modified Newton-HSS methods for solving large systems of nonlinear equations is discussed. In our method, a modified Newton method with R-order of convergence at least three is used to solve the nonlinear equations, and the HSS method is applied to solve the Newton equations approximately. For this class of inexact Newton methods, local and semilocal convergence theorems are proved under suitable conditions. Moreover, a globally convergent modified Newton-HSS method is introduced and a basic global convergence theorem is proved. Numerical results are given to confirm the effectiveness of our method.
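The HSS inner iteration underlying these methods alternates two shifted half-steps. A minimal sketch for a single linear system Ax = b with positive definite non-Hermitian A (generic HSS, not the full Newton-HSS algorithm of the paper):

```python
import numpy as np

def hss(A, b, alpha, x0=None, tol=1e-10, max_iter=500):
    """HSS iteration for A x = b, where A = H + S with
    H = (A + A^H)/2 the Hermitian part and S = (A - A^H)/2 the skew-Hermitian part."""
    n = A.shape[0]
    H = 0.5 * (A + A.conj().T)
    S = 0.5 * (A - A.conj().T)
    I = np.eye(n)
    x = np.zeros(n, dtype=complex) if x0 is None else np.asarray(x0, dtype=complex)
    for _ in range(max_iter):
        # first half-step: shift the Hermitian part
        x_half = np.linalg.solve(alpha * I + H, (alpha * I - S) @ x + b)
        # second half-step: shift the skew-Hermitian part
        x = np.linalg.solve(alpha * I + S, (alpha * I - H) @ x_half + b)
        if np.linalg.norm(A @ x - b) < tol:
            break
    return x
```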

18.
Based on rational approximation, this paper proposes a new class of secant methods for solving univariate unconstrained optimization problems, and the convergence order of the method is shown and proved to be √2+1. The performance of the new method is further analyzed, and numerical experiments are reported comparing the new method with the classical Newton method and other modified secant-type methods on univariate unconstrained optimization problems. Both the theoretical and numerical results show that the new secant method is effective.

19.
§1 Introduction. We consider the following equality constrained minimization problem:

min f(x)   s.t.   c(x) = 0,    (1.1)

where f : R^n → R^1 and c : R^n → R^m are twice continuously differentiable. In [1], Fontecilla proposed the secant methods and proved their local q-superlinear two-step convergence property under certain conditions. However, global convergence is not addressed there, since Fontecilla did not discuss the important question of ensuring progress towards the solution of problem (1.1)…

20.
《Optimization》2012,61(12):2229-2246
The secant (quasi-Newton) equation plays one of the most important roles in finding an optimal solution in nonlinear optimization. Curvature information must satisfy the usual secant equation to ensure positive definiteness of the Hessian approximation. In this work, we present a new diagonal update that improves the Hessian approximation through a modified weak secant equation for the diagonal quasi-Newton (DQN) method. Gradient and function evaluations are utilized to obtain the new weak secant equation and achieve higher-order accuracy in the curvature information. The modified DQN methods based on the modified weak secant equation are globally convergent. Extensive numerical results indicate the advantages of the modified DQN methods over the usual ones and over some classical conjugate gradient methods.
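For orientation, the weak secant equation requires only s_k^T B_{k+1} s_k = s_k^T y_k, and the classical least-change diagonal update enforcing it (the usual DQN building block as we recall it, not the modified version proposed in this paper) is:

```python
import numpy as np

def weak_secant_diagonal_update(B_diag, s, y):
    """Least-change diagonal update enforcing the weak secant equation
    s^T B s = s^T y (standard DQN building block; formula from memory)."""
    gap = s @ y - s @ (B_diag * s)       # violation of the weak secant equation
    return B_diag + gap * s**2 / np.sum(s**4)
```

Substituting back shows the updated diagonal satisfies s^T B s = s^T y exactly; the correction to each diagonal entry is proportional to s_i^2, so it is concentrated where the step actually carries curvature information.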
