Abstract: The control problem of minimizing the guaranteed result is considered for a system described by an ordinary differential equation in the presence of an uncontrollable disturbance. The concepts and problem formulation of /1/ are used. It is shown that, when the optimal control is formed by the method of programmed stochastic synthesis /1-3/, the extremal shift to the accompanying point /1, 4/ can be reduced to an extremal shift against the gradient of an appropriate function. This explains the connection between programmed stochastic synthesis and the generalized Hamilton-Jacobi equation /5, 6/ of the theory of differential games.
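
For orientation only, a minimal sketch of the standard extremal-shift rule from differential game theory, not the paper's specific construction: assume a system $\dot{x} = f(t,x,u,v)$ with control $u \in P$, disturbance $v \in Q$, and a value-type function $V(t,x)$ (all notation here is assumed for illustration). Shifting against the gradient then means choosing the control

$$
u^{\circ}(t,x) \in \arg\min_{u \in P} \max_{v \in Q} \Big\langle \frac{\partial V}{\partial x}(t,x),\; f(t,x,u,v) \Big\rangle ,
$$

where, in the Krasovskii-Subbotin framework, $V$ satisfies (in a suitable generalized sense) the Hamilton-Jacobi equation

$$
\frac{\partial V}{\partial t} + \min_{u \in P} \max_{v \in Q} \Big\langle \frac{\partial V}{\partial x},\; f(t,x,u,v) \Big\rangle = 0 .
$$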