1.
We study convergence properties of a modified subgradient algorithm applied to the dual problem defined by the sharp augmented Lagrangian. The primal problem we consider is nonconvex and nondifferentiable, with equality constraints. We obtain primal and dual convergence results, as well as a condition for the existence of a dual solution. Using a practical selection of the step-size parameters, we demonstrate the algorithm and its advantages on test problems, including an integer programming problem and an optimal control problem. *Partially supported by 2003 UniSA ITEE Small Research Grant Ero2. Supported by CAPES, Brazil, Grant No. 0664-02/2, during her visit to the School of Mathematics and Statistics, UniSA.
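For orientation, the sharp augmented Lagrangian referred to here is commonly written, for a problem of the form min f(x) over x in X subject to h(x) = 0, as sketched below; this is a standard form from the literature on this method, and the paper's notation and step-size rules may differ in detail.

\[
L(x,u,c) \;=\; f(x) \;+\; c\,\|h(x)\| \;-\; \langle u, h(x)\rangle ,
\qquad
H(u,c) \;=\; \min_{x \in X} L(x,u,c),
\]
\[
\text{dual problem:}\quad \max_{u \in \mathbb{R}^m,\; c \ge 0}\; H(u,c),
\]

where the modified subgradient algorithm updates (u, c) using the constraint residual h(x_k) of a minimizer x_k of L(., u_k, c_k).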
2.
On affine scaling algorithms for nonconvex quadratic programming
We investigate the use of interior algorithms, especially the affine-scaling algorithm, to solve nonconvex (indefinite or negative definite) quadratic programming (QP) problems. Although the nonconvex QP with a polytope constraint is a hard problem, we show that the problem with an ellipsoidal constraint is easy. When the hard QP is solved by successively solving the easy QP, the sequence of points converges monotonically to a feasible point satisfying both the first- and second-order optimality conditions. Research supported in part by NSF Grant DDM-8922636 and the College Summer Grant, College of Business Administration, The University of Iowa.
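A minimal sketch of the "easy" ellipsoid-constrained subproblem mentioned above (a trust-region-type problem), in generic notation that is an assumption here rather than the paper's exact iteration:

\[
\min_{d \in \mathbb{R}^n}\;\; \tfrac12\, d^{\top} Q\, d + c^{\top} d
\qquad \text{s.t.}\qquad \|D^{-1} d\| \;\le\; \beta ,
\]

which can be solved to global optimality even for indefinite Q, since d^* is globally optimal exactly when (Q + \mu D^{-2}) d^* = -c and Q + \mu D^{-2} \succeq 0 for some multiplier \mu \ge 0 with \mu\,(\beta - \|D^{-1} d^*\|) = 0.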
3.
Local sensitivity information is obtained for KKT points of parametric NLPs that may exhibit active-set changes under parametric perturbations; under appropriate regularity conditions, computationally relevant generalized derivatives of the primal and dual variable solutions of parametric NLPs are calculated. Ralph and Dempe obtained directional derivatives of solutions of parametric NLPs exhibiting active-set changes from the unique solution of an auxiliary quadratic program. This article uses lexicographic directional derivatives, a newly developed tool in nonsmooth analysis, to generalize the classical NLP sensitivity-analysis theory of Ralph and Dempe. By viewing this auxiliary quadratic program as a parametric NLP, the results of Ralph and Dempe are applied to furnish a sequence of coupled QPs whose unique solutions yield generalized derivative information for the NLP. A practically implementable algorithm is provided. The theory developed here is motivated by widespread applications of nonlinear programming sensitivity analysis, such as dynamic control and optimization problems.
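For context, the underlying setting (paraphrased in generic notation; this is standard parametric-NLP sensitivity material rather than a quotation from the article) is a program

\[
\min_{x}\; f(x,p) \qquad \text{s.t.}\qquad g(x,p) \le 0,\;\; h(x,p) = 0,
\]

whose primal-dual solution map p \mapsto (x(p), \lambda(p), \mu(p)) is, under suitable regularity, directionally differentiable yet nonsmooth at parameter values where the active set changes; the directional derivatives are characterized as solutions of an auxiliary quadratic program built from the Hessian of the Lagrangian and the linearized active constraints.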
4.
In this paper we present augmented Lagrangians for nonconvex minimization problems with equality constraints. We construct a dual problem with respect to the Lagrangian presented here, give saddle-point optimality conditions, and obtain strong duality results. Using these results, we modify the subgradient and cutting-plane methods for solving the constructed dual problem. The algorithms proposed in this paper have several advantages. We do not use any convexity or differentiability conditions, and we show that the dual problem is always concave regardless of the properties the primal problem satisfies. A subgradient of the dual function along which its value increases is calculated without solving any additional problem. In contrast with penalty or multiplier methods, one need not drive the penalty-like parameter to infinity in order to improve the value of the dual function in the new methods. In both methods the value of the dual function strictly increases at each iteration. Moreover, by using the primal-dual gap, the proposed algorithms possess a natural stopping criterion. A convergence theorem for the subgradient method is also presented.
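The concavity claim has a short standard justification (stated here under the assumption that the Lagrangian has the sharp form of item 1; the paper's exact Lagrangian may differ): the dual function

\[
H(u,c) \;=\; \inf_{x \in X}\Big( f(x) \;+\; c\,\|h(x)\| \;-\; \langle u, h(x)\rangle \Big)
\]

is a pointwise infimum of functions that are affine in (u, c), hence concave in (u, c) with no convexity assumption on the primal data; moreover, if \bar{x} attains the infimum, then (-h(\bar{x}), \|h(\bar{x})\|) is a supergradient of H at (u, c), which is the subgradient "available without solving any additional problem" mentioned above.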
5.
Many margin-based binary classification techniques, such as the support vector machine (SVM) and ψ-learning, deliver high performance. An earlier article proposed a new multicategory ψ-learning methodology that shows great promise in generalization ability. However, ψ-learning is computationally difficult because it requires handling a nonconvex minimization problem. In this article, we propose two computational tools for multicategory ψ-learning. The first is based on d.c. algorithms and is solved by sequential quadratic programming, while the second uses the outer approximation method, which yields the global minimizer via sequential concave minimization. Numerical examples show that the proposed algorithms perform well.
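A generic sketch of the d.c. (difference-of-convex) iteration referred to here, with an assumed decomposition J = J_1 - J_2 of the nonconvex ψ-learning objective into convex parts (the paper's actual decomposition and subproblem solver may differ):

\[
w^{(t+1)} \;\in\; \arg\min_{w}\; J_1(w) \;-\; \big\langle \xi^{(t)},\, w \big\rangle,
\qquad \xi^{(t)} \in \partial J_2\big(w^{(t)}\big),
\]

so each iteration solves a convex subproblem obtained by linearizing the concave part, and the objective value J(w^{(t)}) is nonincreasing along the iterates.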
6.
We develop criteria for the existence and uniqueness of the global minima of a continuous bounded function on a noncompact set. Special attention is given to the problem of parameter estimation via minimization of the sum of squares in nonlinear regression and maximum likelihood. Definitions of local convexity and unimodality are given using the level set. A fundamental theorem of nonconvex optimization is formulated: If a function approaches the minimal limiting value at the boundary of the optimization domain from below and its Hessian matrix is positive definite at the point where the gradient vanishes, then the function has a unique minimum. It is shown that the local convexity level of the sum of squares is equal to the minimal squared radius of the regression curvature. A new multimodal function is introduced, the decomposition function, which can be represented as the composition of a convex function and a nonlinear function from the argument space to a space of larger dimension. Several general global criteria based on majorization and minorization functions are formulated.
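In symbols, the criterion quoted above can be paraphrased as follows (a reformulation under mild smoothness assumptions, not the paper's exact wording), for a twice-differentiable f on an open domain \Omega:

\[
\inf_{x \in \Omega} f(x) \;<\; \liminf_{y \to \partial\Omega} f(y)
\quad\text{and}\quad
\nabla^2 f(z) \succ 0 \;\text{ whenever }\; \nabla f(z) = 0
\;\;\Longrightarrow\;\;
f \;\text{has a unique global minimizer on } \Omega,
\]

where the boundary is understood to include points at infinity when \Omega is unbounded.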
7.
We consider a recent branch-and-bound algorithm of the authors for nonconvex quadratic programming. The algorithm is characterized by its use of semidefinite relaxations within a finite branching scheme. In this paper, we specialize the algorithm to the box-constrained case and study its implementation, which is shown to be a state-of-the-art method for globally solving box-constrained nonconvex quadratic programs. S. Burer was supported in part by NSF Grants CCR-0203426 and CCF-0545514.
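For reference, the box-constrained problem and the flavor of semidefinite relaxation used inside such branch-and-bound schemes can be sketched as follows (a generic Shor-plus-RLT relaxation written as an assumption; the algorithm in the paper augments the relaxation with additional constraints):

\[
\min_{x \in [0,1]^n}\; x^{\top} Q x + c^{\top} x
\quad\leadsto\quad
\min_{x,\,X}\; \langle Q, X\rangle + c^{\top} x
\quad\text{s.t.}\quad
\begin{pmatrix} 1 & x^{\top}\\ x & X \end{pmatrix} \succeq 0,
\;\; \operatorname{diag}(X) \le x,
\;\; 0 \le x \le 1,
\]

whose optimal value lower-bounds the nonconvex objective because X = x x^{\top} is feasible for the relaxation whenever x is feasible for the original problem.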
8.
We consider the problem of minimizing an integral functional whose integrand is not convex in the control, over solutions of a control system described by a fractional differential equation with mixed nonconvex constraints on the control. A relaxation problem is treated along with the original problem. It is proved that, under general assumptions, the relaxation problem has an optimal solution, and that for each optimal solution there is a minimizing sequence of the original problem that converges to that optimal solution with respect to the trajectory, the control, and the functional, simultaneously in the appropriate topologies.
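One standard setting that matches this description is sketched below (an assumption for illustration; the paper's precise hypotheses on the data and the fractional derivative are more detailed):

\[
\min\; \int_0^T g\big(t, x(t), u(t)\big)\,dt
\qquad\text{s.t.}\qquad
D^{\alpha} x(t) = f\big(t, x(t), u(t)\big),\quad u(t) \in U\big(t, x(t)\big),\quad 0 < \alpha < 1,
\]

where the relaxed problem replaces the integrand by its convexification with respect to u and the control constraint set by its closed convex hull; the result stated above says the relaxed problem attains its minimum and its optimal solutions are approached, in the sense described, by minimizing sequences of the original problem.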
9.
A counter-example is given to several recently published results on duality bound methods for nonconvex global optimization.
10.
As computing resources continue to improve, global solutions for larger quadratically constrained optimization problems become more achievable. In this paper, we focus on larger problems and obtain accurate bounds for their optimal values through the successive use of SDP relaxations on a parallel computing system called Ninf (Network-based Information Library for high performance computing).
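As a point of reference, the kind of problem and a basic relaxation involved can be written generically as follows (the successive, parallelized SDP scheme in the paper tightens such bounds beyond this single relaxation):

\[
\min_{x}\; x^{\top} Q_0 x + c_0^{\top} x
\quad\text{s.t.}\quad
x^{\top} Q_i x + c_i^{\top} x + r_i \le 0,\;\; i = 1,\dots,m,
\]
\[
\min_{x,\,X}\; \langle Q_0, X\rangle + c_0^{\top} x
\quad\text{s.t.}\quad
\langle Q_i, X\rangle + c_i^{\top} x + r_i \le 0,\;\;
\begin{pmatrix} 1 & x^{\top}\\ x & X \end{pmatrix} \succeq 0,
\]

where the second (semidefinite) program is convex and its optimal value is a lower bound on that of the first, since X = x x^{\top} is feasible for it whenever x is feasible for the original problem.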