
Stabilization of a linear system via rotational control. (English) Zbl 0801.93033

Summary: We consider a linear stochastic differential equation of the form \[ dx_t = \left( \begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix} + u \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} \right) x_t\, dt + \sigma \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} x_t \circ dW_t, \] where \(u\) is a control bounded by \(\pm K\), and the objective is to minimize \(\| x_t\|\).
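As a purely illustrative aid (not part of the paper), the following sketch integrates the Stratonovich equation above with the Euler–Heun scheme, writing \(A = \operatorname{diag}(a,b)\) and \(J\) for the rotation generator; the parameter values, the constant control \(u\), and the function names are assumptions chosen only for demonstration.

```python
# Minimal sketch (illustrative parameters, constant control u): Euler-Heun
# integration of the Stratonovich SDE  dx = (A + u J) x dt + sigma J x o dW,
# with A = diag(a, b) and J = [[0, -1], [1, 0]].
import numpy as np

def simulate(a=1.0, b=-2.0, sigma=0.5, u=3.0, x0=(1.0, 0.0),
             T=10.0, dt=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    A = np.array([[a, 0.0], [0.0, b]])
    J = np.array([[0.0, -1.0], [1.0, 0.0]])
    drift = A + u * J                      # deterministic part (A + u J) x
    x = np.array(x0, dtype=float)
    n = int(T / dt)
    path = np.empty((n + 1, 2))
    path[0] = x
    for k in range(n):
        dW = rng.normal(scale=np.sqrt(dt))
        g = sigma * (J @ x)                # diffusion g(x) = sigma J x
        x_pred = x + drift @ x * dt + g * dW            # Euler predictor
        g_pred = sigma * (J @ x_pred)
        x = x + drift @ x * dt + 0.5 * (g + g_pred) * dW  # Heun (Stratonovich) corrector
        path[k + 1] = x
    return path

if __name__ == "__main__":
    path = simulate()
    print("final state:", path[-1], "final norm:", np.linalg.norm(path[-1]))
```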
We first restrict ourselves to the finite time interval \([0,T]\) and, using Haussmann's maximum principle, find explicitly a bang-bang control that is optimal for minimizing \({\mathbf E}\{f(\| x_T\|)\}\) for suitable increasing functions \(f\).
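For orientation only (this is the generic bang-bang structure under the bound \(|u|\leq K\), not the paper's explicit formula): such a control takes only the extreme admissible values, \[ u_t = K\,\operatorname{sign}(s_t) \in \{-K, +K\}, \] where the switching process \(s_t\) is determined by the adjoint (costate) variables arising from the maximum principle.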
Then the asymptotic behaviour of the controlled stochastic differential equation is studied. Using Lyapunov exponents, we show that for \(K\) large enough the process \(\{\| x_t\| : t\geq 0\}\) can be stabilized both in an almost sure sense and in an \(L^p\) sense. We also give some results on the dependence of the Lyapunov exponent and the moment Lyapunov exponent on the parameters of the stochastic differential equation.
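The almost sure Lyapunov exponent in question is \(\lambda = \lim_{t\to\infty} \tfrac{1}{t}\log\| x_t\|\); stabilization corresponds to \(\lambda < 0\). The sketch below, which is not the paper's analysis, estimates \(\lambda\) numerically for a few constant controls \(u\) (rather than the optimal bang-bang law) to illustrate how the exponent's dependence on \((a, b, \sigma, u)\) can be probed; all values are assumptions for demonstration.

```python
# Minimal sketch (illustrative, constant control): estimate the almost sure
# Lyapunov exponent  lambda = lim (1/t) log ||x_t||  by simulating one long
# trajectory with periodic renormalization to avoid over/underflow.
import numpy as np

def lyapunov_exponent(a=1.0, b=-2.0, sigma=0.5, u=5.0,
                      T=200.0, dt=1e-3, seed=1):
    rng = np.random.default_rng(seed)
    A = np.array([[a, 0.0], [0.0, b]])
    J = np.array([[0.0, -1.0], [1.0, 0.0]])
    drift = A + u * J
    x = np.array([1.0, 0.0])
    log_norm = 0.0                          # accumulated log ||x_t|| / ||x_0||
    for _ in range(int(T / dt)):
        dW = rng.normal(scale=np.sqrt(dt))
        g = sigma * (J @ x)
        x_pred = x + drift @ x * dt + g * dW
        g_pred = sigma * (J @ x_pred)
        x = x + drift @ x * dt + 0.5 * (g + g_pred) * dW
        r = np.linalg.norm(x)
        log_norm += np.log(r)               # growth factor of this step
        x /= r                              # renormalize to unit length
    return log_norm / T

if __name__ == "__main__":
    for u in (0.0, 2.0, 5.0):
        print(f"u = {u:4.1f}   lambda ~ {lyapunov_exponent(u=u):+.3f}")
```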

MSC:

93C05 Linear systems in control theory
93D05 Lyapunov and other classical stabilities (Lagrange, Poisson, \(L^p, l^p\), etc.) in control theory
93E03 Stochastic systems in control theory (general)