Dynamic programming in stochastic control of systems with delay
Authors: Bjørnar Larssen
Institution: School of Business, Faculty of Business, Public Administration and Social Work, Oslo University College, Pilestredet 56, Oslo, N-0167, Norway
Abstract:

We consider optimal control problems for systems described by stochastic differential equations with delay (SDDEs). We prove a version of Bellman's principle of optimality (the dynamic programming principle) for a general class of such problems; by a general class we mean that both the dynamics and the cost may depend on the past in a general way. As an application, we study systems in which the value function depends on the past only through some weighted average. For such systems we obtain a Hamilton-Jacobi-Bellman partial differential equation that the value function must satisfy, provided it is sufficiently smooth. The weak uniqueness of the SDDEs we consider is our main tool in proving this result. Notions of strong and weak uniqueness for SDDEs are introduced, and we prove that strong uniqueness implies weak uniqueness, just as for ordinary stochastic differential equations.
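
For orientation, the following LaTeX sketch spells out the kind of setup the abstract refers to. The notation (path segment X_t, delay length r, horizon T, running and terminal costs f and g, and the exponential weight e^{λs}) is illustrative and not taken from the paper itself.

% Illustrative sketch only; notation is assumed, not quoted from the paper.
\begin{align*}
  dX(t) &= b\bigl(t, X_t, u(t)\bigr)\,dt + \sigma\bigl(t, X_t, u(t)\bigr)\,dW(t),
  \qquad X_t(s) := X(t+s),\ s \in [-r,0], \\
  V(t, x_t) &= \sup_{u}\, \mathbb{E}\Bigl[\int_t^T f\bigl(s, X_s, u(s)\bigr)\,ds + g\bigl(X(T)\bigr)\Bigr],
\end{align*}
and the dynamic programming principle then reads, for small $h>0$,
\begin{align*}
  V(t, x_t) &= \sup_{u}\, \mathbb{E}\Bigl[\int_t^{t+h} f\bigl(s, X_s, u(s)\bigr)\,ds + V\bigl(t+h, X_{t+h}\bigr)\Bigr].
\end{align*}
In the special case where the value depends on the past only through a weighted average, one may write, for instance,
\begin{align*}
  Y(t) = \int_{-r}^{0} e^{\lambda s}\, X(t+s)\,ds,
  \qquad
  V(t, x_t) = W\bigl(t, x_t(0), y\bigr) \ \text{with}\ y = \int_{-r}^{0} e^{\lambda s}\, x_t(s)\,ds,
\end{align*}
and, when sufficiently smooth, $W$ solves a Hamilton-Jacobi-Bellman partial differential equation in the finite-dimensional variables $(t, x, y)$.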
Keywords: Stochastic Delay Equations; Optimal Stochastic Control; Dynamic Programming; Hamilton-Jacobi-Bellman Equations