1.
We present a new matrix-free method for the computation of negative curvature directions based on the eigenstructure of minimal-memory BFGS matrices. We determine the eigenvalues of these matrices via simple formulas and compute the desired eigenvectors in explicit form. Consequently, a negative curvature direction is computed in a way that avoids the storage and factorization of any matrix. We propose a modification of the L-BFGS method in which no information is kept from old iterations, so that memory requirements are minimal. The proposed algorithm incorporates a curvilinear path and a line search procedure that combines two search directions: a memoryless quasi-Newton direction and a direction of negative curvature. Results of numerical experiments for large-scale problems are also presented.
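The memoryless quasi-Newton direction described above can be sketched with the standard one-pair L-BFGS two-loop recursion (a minimal illustration of the memoryless direction only, not the paper's eigenstructure-based negative-curvature computation; function and variable names are ours):

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Memoryless L-BFGS direction -H g, where H is the inverse-Hessian
    approximation built from a single (s, y) pair on top of a scaled identity.
    Requires the curvature condition s^T y > 0."""
    rho = 1.0 / np.dot(y, s)
    # First loop (one pair):
    alpha = rho * np.dot(s, g)
    q = g - alpha * y
    # Scaled-identity initial inverse Hessian H0 = gamma * I:
    gamma = np.dot(s, y) / np.dot(y, y)
    r = gamma * q
    # Second loop (one pair):
    beta = rho * np.dot(y, r)
    return -(r + (alpha - beta) * s)
```

Because H is positive definite whenever s^T y > 0, the returned vector is always a descent direction; it also satisfies the secant condition, so feeding g = y returns exactly -s.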
2.
An Iterative Decomposition Method for Small-Footprint LiDAR Waveforms
To address the shortcomings of traditional LiDAR waveform decomposition methods, which are severely affected by noise and cannot adequately resolve complex overlapping or weak echoes, a new waveform decomposition method is proposed. The random and background noise of the waveform is estimated by computing the change in amplitude before and after filtering. Using a layer-by-layer stripping strategy, waveform components are successively extracted from the raw waveform data until the largest remaining peak falls below a given threshold. The L-BFGS algorithm is used to refine the initial parameters and obtain the optimal parameters of each component; finally, components whose positions are too close together are merged. The method is computationally fast, detects weak echoes well, and significantly improves the density and accuracy of the decomposed point cloud. Its effectiveness is verified by decomposing a large amount of LiDAR waveform data.
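The layer-by-layer stripping strategy can be sketched as iterative Gaussian peeling, with each component refined by L-BFGS (here SciPy's L-BFGS-B on a least-squares loss; the threshold, bounds, initial sigma, and the omitted merging step are simplifying assumptions of ours, not the paper's exact procedure):

```python
import numpy as np
from scipy.optimize import minimize

def gaussian(t, A, mu, sigma):
    return A * np.exp(-(t - mu) ** 2 / (2 * sigma ** 2))

def decompose_waveform(t, w, noise_threshold, max_components=6):
    """Iteratively peel Gaussian components off a waveform: take the largest
    remaining peak, refine (A, mu, sigma) with L-BFGS-B, subtract the fitted
    component, and stop when the residual peak falls below the threshold."""
    residual = w.astype(float).copy()
    components = []
    for _ in range(max_components):
        i = np.argmax(residual)
        if residual[i] < noise_threshold:
            break
        # Initialize from the current peak; a few samples wide as a rough sigma.
        x0 = np.array([residual[i], t[i], (t[1] - t[0]) * 3])
        loss = lambda p: np.sum((residual - gaussian(t, *p)) ** 2)
        res = minimize(loss, x0, method="L-BFGS-B",
                       bounds=[(0, None), (t[0], t[-1]), (1e-6, None)])
        components.append(res.x)
        residual = residual - gaussian(t, *res.x)
    return components
```

On a synthetic waveform with two well-separated Gaussian echoes, two iterations recover both component positions and the third residual falls below the threshold.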
3.
This paper describes a class of optimization methods that interlace iterations of the limited-memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited-memory matrix, which plays the dual role of preconditioning the inner conjugate gradient iteration in the HFN method and of providing an initial matrix for L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithms are effective and insensitive to the choice of parameters.
4.
Dacian N. Daescu 《International Journal of Computational Fluid Dynamics》2013,27(4):299-306
In four-dimensional variational data assimilation (4D-Var), an optimal estimate of the initial state of a dynamical system is obtained by solving a large-scale unconstrained minimization problem. The gradient of the cost functional may be efficiently computed using adjoint modeling, at a cost equivalent to a few forward model integrations; for most practical applications, evaluation of the Hessian matrix is not feasible due to the large dimension of the discrete state vector. Hybrid methods aim to provide an improved optimization algorithm by dynamically interlacing inexpensive L-BFGS iterations with fast-converging Hessian-free Newton (HFN) iterations. In this paper, a comparative analysis of the performance of a hybrid method vs. L-BFGS and HFN optimization methods is presented in the 4D-Var context. Numerical results for a two-dimensional shallow-water model show that the performance of the hybrid method is sensitive to the selection of the method parameters, such as the lengths of the L-BFGS and HFN cycles and the number of inner conjugate gradient iterations during the HFN cycle. Superior performance may be obtained in the hybrid approach with a proper selection of the method parameters. The applicability of the new hybrid method in the framework of operational 4D-Var, in terms of computational cost and performance, is also discussed.
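The interlacing of L-BFGS and HFN cycles can be sketched with SciPy as follows (a simplified illustration with fixed cycle lengths, whereas the hybrid methods above adjust them dynamically and precondition the inner CG with the limited-memory matrix; the finite-difference Hessian-vector product stands in for adjoint-based products):

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_minimize(f, grad, x0, n_cycles=5, lbfgs_iters=10, hfn_iters=3):
    """Alternate a short L-BFGS cycle with a Hessian-free Newton (Newton-CG)
    cycle. Hessian-vector products are approximated by a forward difference
    of the gradient, so no Hessian matrix is ever formed."""
    x = np.asarray(x0, dtype=float)
    eps = 1e-6
    for _ in range(n_cycles):
        # L-BFGS cycle (capped iteration count):
        x = minimize(f, x, jac=grad, method="L-BFGS-B",
                     options={"maxiter": lbfgs_iters}).x
        # HFN cycle: Newton-CG with matrix-free Hessian-vector products.
        hessp = lambda x_, v: (grad(x_ + eps * v) - grad(x_)) / eps
        x = minimize(f, x, jac=grad, hessp=hessp, method="Newton-CG",
                     options={"maxiter": hfn_iters}).x
    return x
```

On a well-conditioned quadratic the Newton-CG cycle is essentially exact, so a few alternating cycles drive the objective to machine-level accuracy.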