Similar Articles (10 results)
1.
In this paper, we first present an adaptive nonmonotone term to improve the efficiency of the nonmonotone line search, and then suggest an active set identification technique to obtain a more efficient descent direction, which improves the local convergence behavior of the algorithm and reduces the computational cost. By means of the adaptive nonmonotone line search and the active set identification technique, we put forward a globally convergent gradient-based method for nonnegative matrix factorization (NMF) based on the alternating nonnegative least squares framework, in which we introduce a modified Barzilai-Borwein (BB) step size. The modified BB step size and a larger-step-size strategy are exploited to accelerate convergence. Finally, the results of extensive numerical experiments on both synthetic and image datasets show that the proposed method is efficient in terms of computational speed.
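For context, the alternating nonnegative least squares (ANLS) framework the abstract builds on splits NMF into two convex subproblems that are solved in turn; a minimal sketch of this standard formulation (not the authors' specific algorithm) is:

```latex
\min_{W \ge 0,\; H \ge 0} \; f(W,H) = \tfrac{1}{2}\,\lVert V - WH \rVert_F^2,
\qquad
W^{(k+1)} = \operatorname*{arg\,min}_{W \ge 0} \tfrac{1}{2}\,\lVert V - W H^{(k)} \rVert_F^2,
\quad
H^{(k+1)} = \operatorname*{arg\,min}_{H \ge 0} \tfrac{1}{2}\,\lVert V - W^{(k+1)} H \rVert_F^2.
```

In the abstract's method, each subproblem would be attacked with the gradient-based scheme using the modified BB step size and the active set identification.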

2.
An active set subspace Barzilai-Borwein gradient algorithm for large-scale bound constrained optimization is proposed. The active sets are estimated by an identification technique. The search direction consists of two parts: some of the components are defined in a simple way, while the remaining components are determined by the Barzilai-Borwein gradient method. A nonmonotone line search strategy that guarantees global convergence is used. Preliminary numerical results show that the proposed method is promising and competitive with the well-known SPG method on a subset of bound constrained problems from the CUTEr collection. This work was supported by the 973 Project (grant 2004CB719402) and the NSF of China (grant 10471036).
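As a hedged illustration only (the abstract does not spell out the identification rule), a common active-set estimate for bounds $l \le x \le u$ with gradient $g(x) = \nabla f(x)$ and a small tolerance $\epsilon > 0$ is

```latex
\mathcal{A}(x) \;=\; \{\, i : x_i \le l_i + \epsilon,\; g_i(x) > 0 \,\} \;\cup\; \{\, i : x_i \ge u_i - \epsilon,\; g_i(x) < 0 \,\},
```

where components in $\mathcal{A}(x)$ are held at (or moved to) their bounds, while the free components form the subspace in which the Barzilai-Borwein step is taken.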

3.
In this paper, we consider a multivariate spectral projected gradient (MSPG) method for bound constrained optimization. By exploiting a quasi-Newton property, the method allows an individual adaptive step size along each coordinate direction. Global convergence is established on the basis of a nonmonotone line search. A numerical comparison with the traditional SPG method shows that the method is promising.
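A hedged sketch of the componentwise (multivariate) spectral step, assuming the standard construction from the secant condition (details may differ from the paper): with $s_{k-1} = x_k - x_{k-1}$ and $y_{k-1} = g_k - g_{k-1}$,

```latex
D_k = \operatorname{diag}\!\left( \frac{y_{k-1}^1}{s_{k-1}^1}, \ldots, \frac{y_{k-1}^n}{s_{k-1}^n} \right),
\qquad
x_{k+1} = P_\Omega\!\left( x_k - D_k^{-1} g_k \right),
```

so each coordinate $i$ receives its own spectral step length $s_{k-1}^i / y_{k-1}^i$ (suitably safeguarded when the quotient is negative or near zero), and $P_\Omega$ denotes the projection onto the box constraints.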

4.
The Barzilai–Borwein (BB) gradient method has been widely studied owing to its simplicity and numerical efficiency. By incorporating a nonmonotone line search, Raydan (SIAM J Optim. 1997;7:26–33) successfully extended the BB gradient method to general unconstrained optimization problems, making it competitive with conjugate gradient methods. However, the numerical results reported by Raydan are poor for very ill-conditioned problems, because the degree of nonmonotonicity can have a noticeable effect there. In this paper, we focus on the nonmonotone line search technique used in the global Barzilai–Borwein (GBB) gradient method. We improve the performance of the GBB gradient method by proposing an adaptive nonmonotone line search based on the morphology of the objective function. We also prove global convergence and an R-linear convergence rate of the proposed method under reasonable assumptions. Finally, we report numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection. The results show the efficiency of the proposed method in the sense of the performance profiles introduced by Dolan and Moré (Math Program. 2002;91:201–213).
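To make the ingredients concrete, below is a minimal Python sketch of a GBB-style iteration (the BB step combined with the max-type nonmonotone Armijo test of Grippo, Lampariello and Lucidi that Raydan used); the function and parameter names are illustrative, and the adaptive line search proposed in the paper is not reproduced here.

```python
import numpy as np

def gbb(f, grad, x0, M=10, gamma=1e-4, max_iter=1000, tol=1e-6):
    """Global Barzilai-Borwein gradient method with a max-type
    nonmonotone (GLL) line search -- an illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                      # initial trial step size
    f_hist = [f(x)]                  # recent function values (memory M)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -g                       # steepest-descent direction
        lam = alpha                  # trial step starts at the BB step
        f_ref = max(f_hist[-M:])     # nonmonotone reference value
        while f(x + lam * d) > f_ref + gamma * lam * (g @ d):
            lam *= 0.5               # backtrack until the test is passed
        x_new = x + lam * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0  # safeguarded BB1 step
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

For example, `gbb(lambda x: x @ x, lambda x: 2 * x, np.ones(100))` minimizes a simple quadratic.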

5.
The conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. Liu and Storey developed a conjugate gradient method that has good numerical performance but lacks global convergence under traditional line searches such as the Armijo, Wolfe, and Goldstein line searches. In this paper we propose a new nonmonotone line search for the Liu-Storey (LS) conjugate gradient method. The new nonmonotone line search guarantees the global convergence of the LS method and has good numerical performance. By estimating the Lipschitz constant of the derivative of the objective function within the new nonmonotone line search, we can find an adequate step size and substantially decrease the number of function evaluations at each iteration. Numerical results show that the new approach is effective in practical computation.
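For reference, the Liu-Storey search direction that the abstract refers to is the standard one (the proposed nonmonotone line search itself is not reproduced here):

```latex
d_k =
\begin{cases}
-g_k, & k = 0,\\[2pt]
-g_k + \beta_k^{LS}\, d_{k-1}, & k \ge 1,
\end{cases}
\qquad
\beta_k^{LS} = \frac{g_k^{\mathsf T}\,(g_k - g_{k-1})}{-\,d_{k-1}^{\mathsf T} g_{k-1}}.
```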

6.
On the Nonmonotone Line Search
The technique of nonmonotone line search has received many successful applications and extensions in nonlinear optimization. This paper provides some basic analyses of the nonmonotone line search. Specifically, we analyze nonmonotone line search methods for general nonconvex functions along different lines. The analyses are helpful in establishing the global convergence of a nonmonotone line search method under weaker conditions on the search direction. We also explore the relation between the nonmonotone line search and R-linear convergence under the assumption that the objective function is uniformly convex. In addition, taking the inexact Newton method as an example, we observe a numerical drawback of the original nonmonotone line search and suggest switching to a standard Armijo line search when the nonmonotone line search condition is not satisfied by the prior trial step length. The numerical results show the usefulness of this suggestion for the inexact Newton method.
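A minimal Python sketch of the suggested fallback (an assumed form, with illustrative names such as `f_hist` and `gamma`): the prior trial step length is accepted if it passes the nonmonotone test; otherwise a standard monotone Armijo backtracking is performed.

```python
def accept_step(f, x, d, g_dot_d, lam0, f_hist, M=10, gamma=1e-4, beta=0.5):
    """Accept the prior trial step under the nonmonotone test, or fall back
    to a standard Armijo backtracking line search (illustrative sketch).
    f_hist[-1] is assumed to equal f(x)."""
    f_ref = max(f_hist[-M:])                   # nonmonotone reference value
    if f(x + lam0 * d) <= f_ref + gamma * lam0 * g_dot_d:
        return lam0                            # nonmonotone condition holds
    lam = lam0
    while f(x + lam * d) > f_hist[-1] + gamma * lam * g_dot_d:
        lam *= beta                            # monotone Armijo backtracking
    return lam
```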

7.
In this paper, an adaptive nonmonotone line search method for unconstrained minimization problems is proposed. At every iteration, the new algorithm selects only one of two directions, a Newton-type direction or a negative curvature direction, along which to perform the line search. The nonmonotone technique is incorporated into the backtracking line search when the Newton-type direction is used. Furthermore, if the negative curvature direction is used, the step length is increased under certain conditions. Global convergence to a stationary point satisfying second-order optimality conditions is established. Numerical results showing the efficiency of the new algorithm are reported.
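As a hedged illustration of the direction selection (the abstract does not state the precise criteria): at an iterate $x_k$ with gradient $g_k$ and Hessian $H_k$, a typical scheme of this kind uses

```latex
d_k =
\begin{cases}
s_k \ \text{with}\ H_k s_k \approx -g_k, & \text{(Newton-type) if } H_k \text{ is (nearly) positive definite},\\[2pt]
p_k \ \text{with}\ p_k^{\mathsf T} H_k p_k < 0,\ g_k^{\mathsf T} p_k \le 0, & \text{(negative curvature) otherwise},
\end{cases}
```

with nonmonotone backtracking along the Newton-type direction and a possibly enlarged step along the negative curvature direction, as described in the abstract.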

8.
A derivative-free iterative scheme that uses the residual vector as the search direction for solving large-scale systems of nonlinear monotone equations is presented. It is closely related to two recently proposed spectral residual methods for nonlinear systems, which use a nonmonotone line-search globalization strategy and a step size based on the Barzilai-Borwein choice. A global convergence analysis is presented. In order to study the numerical behavior of the algorithm, an extensive series of numerical experiments is included. These computational experiments show that the new algorithm is efficient.
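A hedged Python sketch of a residual-direction iteration of this type, in the spirit of the spectral residual methods mentioned above (the safeguards and the exact nonmonotone line search of the paper are not reproduced):

```python
import numpy as np

def spectral_residual(F, x0, sigma0=1.0, max_iter=500, tol=1e-8):
    """Derivative-free iteration for F(x) = 0 that uses the residual as
    the search direction with a spectral (BB-like) step -- sketch only."""
    x = np.asarray(x0, dtype=float)
    r = F(x)
    sigma = sigma0
    for _ in range(max_iter):
        if np.linalg.norm(r) <= tol:
            break
        d = -sigma * r                 # residual vector as the search direction
        x_new = x + d                  # (a nonmonotone line search would scale d here)
        r_new = F(x_new)
        s, y = x_new - x, r_new - r
        sy = s @ y
        sigma = (s @ s) / sy if abs(sy) > 1e-12 else 1.0  # spectral step update
        x, r = x_new, r_new
    return x
```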

9.
In this paper, we develop an active set identification technique. By means of this technique, we present an active set adaptive monotone projected Barzilai-Borwein method (ASAMPBB) for solving nonnegative matrix factorization (NMF) based on the alternating nonnegative least squares framework, in which the Barzilai-Borwein (BB) step sizes are picked adaptively to obtain meaningful improvements in the convergence rate. To obtain a suitable step size, curvature information is taken into account. In addition, a larger-step-size technique is exploited to accelerate convergence of the proposed method. The global convergence of the proposed method is analysed under mild assumptions. Finally, the results of numerical experiments on both synthetic and real-world datasets show that the proposed method is effective.
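For the nonnegative least squares subproblems, a hedged sketch of the projected BB update and the two classical BB step sizes between which an adaptive, curvature-aware choice can be made (the paper's specific rule and active-set handling are not reproduced here): with $S_t = H^{(t)} - H^{(t-1)}$ and $Y_t = \nabla_H f(W, H^{(t)}) - \nabla_H f(W, H^{(t-1)})$,

```latex
H^{(t+1)} = P_{+}\!\left( H^{(t)} - \alpha_t \nabla_H f\!\left(W, H^{(t)}\right) \right),
\qquad
\alpha_t^{\mathrm{BB1}} = \frac{\langle S_t, S_t \rangle}{\langle S_t, Y_t \rangle},
\quad
\alpha_t^{\mathrm{BB2}} = \frac{\langle S_t, Y_t \rangle}{\langle Y_t, Y_t \rangle},
```

where $P_{+}$ is the projection onto the nonnegative orthant and $\langle \cdot, \cdot \rangle$ is the Frobenius inner product.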

10.
Adaptive Two-Point Stepsize Gradient Algorithm
Combined with a nonmonotone line search, the two-point stepsize gradient method has been applied successfully to large-scale unconstrained optimization. However, the numerical performance of the algorithm depends heavily on M, one of the parameters of the nonmonotone line search, even for ill-conditioned problems. This paper proposes an adaptive nonmonotone line search. The two-point stepsize gradient method is shown to be globally convergent with this adaptive nonmonotone line search. Numerical results show that the adaptive nonmonotone line search is especially suitable for the two-point stepsize gradient method.
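For reference, M enters through the standard max-type nonmonotone acceptance condition (the adaptive scheme proposed in the paper is not reproduced here):

```latex
f(x_k + \lambda_k d_k) \;\le\; \max_{0 \le j \le \min(k,\,M)} f(x_{k-j}) \;+\; \delta\, \lambda_k\, g_k^{\mathsf T} d_k,
\qquad \delta \in (0, 1),
```

where a larger M allows more nonmonotone behavior, while M = 0 recovers the classical Armijo condition.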
