
Automatica, Vol. 5, pp. 95-112. Pergamon Press, 1969. Printed in Great Britain.

A Survey of Stability of Stochastic Systems*


F. KOZIN†

Stochastic systems are becoming extensively used as realistic models of physical phenomena. Stability is an important qualitative property of the dynamics of such systems, and here the major contributions of students of the subject are surveyed.

(This paper, and the less formal article which follows, are two of four presented at a technical session, "Stochastic Problems in Control," which was held on 28 June 1968 at the annual Joint American Control Conference, JACC, of the American Automatic Control Council, AACC, in Ann Arbor, Michigan, where an IFAC Executive Council meeting was also held. This special and well attended session featured invited papers; it was organized by Bernard Friedland of General Precision Inc., New Jersey, U.S.A.; and it was sponsored by the American Society of Mechanical Engineers, ASME, from which permission was granted to publish these articles. Although all four papers were considered excellent, not all of them could be published here. However, they all may be purchased in one booklet from the ASME, United Engineering Center, 345 East 47th Street, New York, New York, U.S.A.--editor)

Summary--The main purpose of this manuscript is to give the interested reader some understanding of the subject of stability of stochastic systems. We present some of the basic ideas as well as a survey of results that have appeared in the literature. Theorems are presented, where appropriate, but no proofs are given. It is hoped that the interested reader will go to the original source for the full development of the ideas involved. We restrict ourselves to continuous parameter models and distinguish between those systems that do incorporate Gaussian white noise in the system equations, and those systems that do not incorporate Gaussian white noise. A reasonably representative bibliography is included.

I. INTRODUCTION

SINCE the 1940's, the theory of stochastic processes and random function analysis has developed to such a large extent, as they relate to applied problems, that portions of these topics are expected to be included among the analytical tools of all modern day engineering graduates. Thus, as stochastic models have come to be more fully understandable to engineers and scientists, the study of rather important stochastic system properties has become possible. Among these is the property of stability.

The advances that have occurred in the study of stability of stochastic systems within the past 7 or 8 years have been quite formidable. Major efforts by a large body of investigators, especially in the U.S.A. and U.S.S.R., have brought the subject to a point where the results of the various investigations can be applied to the study of many practical problems in the general engineering sciences. Moreover, we will most likely see the scope of these applications increase significantly in the not too distant future. For example, biological systems, which often possess an almost sure type of stability, referred to as "ultra-stability" by W. R. ASHBY in the book "Design for a Brain", are an area in which stochastic stability is sure to play a major role.

Stability, under its many definitions, appears to be a qualitative property basic to all systems, natural as well as man made. As we continue to learn to formulate and apply, in a meaningful way, stochastic models of systems, stability studies of these models will only increase in their importance.

In this manuscript we shall attempt to present some of the ideas as well as a survey of results concerning stochastic stability. It is assumed that a large portion of the readers may not have any experience with the more fundamental ideas of stochastic process theory. Thus we have attempted to concentrate on what is known about the various classes of stochastic systems whose stability properties have been investigated. Theorems shall be stated where appropriate, but no proofs will be given. The reader may go to the original source for the full development of the ideas involved. Naturally, we cannot hope to cover the entire subject, whose literature has grown so greatly in recent years. But, if we can generate the feeling that, after having read this paper, the reader has a better idea of what the subject is about and what the problems of the subject are, then we shall have accomplished our purpose.

We will restrict ourselves to continuous parameter models. In section II we define various concepts of stochastic stability. In section III we shall discuss systems that do not incorporate Gaussian white noise in the system equations. This will distinguish the class of systems in section III from those discussed in section IV, which do incorporate Gaussian white noise in the system equations.

* Received in revised form 28 July 1968. Recommended for possible publication by associate editor P. Dorato. The work was supported by NSF Grant GU-1557.
† Polytechnic Institute of Brooklyn, Brooklyn, N.Y., U.S.A.


II. CONCEPTS OF STOCHASTIC STABILITY

Stability is a qualitative property of the solutions to differential equations, which can often be studied without direct recourse to solving the equations. Stability concepts are usually defined in terms of convergence relative to parameters such as the initial conditions, or the time parameter. The literature on the topic abounds with the many concepts of stability that have been studied. These stability concepts have, in general, been derived for the study of deterministic systems. It follows that there are at least as many stability concepts for the study of stochastic systems as there are for the study of deterministic systems. The reason is quite straightforward. The deterministic concepts of stability have their counterparts in each of the common modes of convergence of probability theory.

The reader may recall that the common modes of convergence to be found in the texts are convergence in probability, convergence in the mean and almost sure convergence. Thus, it is clear that there are at least three times as many concepts of stability as for the usual deterministic case. Indeed, there are even more. Of course the reader can well imagine that not all of the possible definitions of stochastic stability will be of interest, simply because they may be too weak to be of practical significance. The following example of convergence in probability illustrates this point.

Example 2.1
Consider the process {x(t), t ∈ (0, ∞)} defined on each unit interval n−1 < t < n, n = 1, 2, ..., as

    x(t) = 2n²(t − p_n) + 1,   for p_n − 1/(2n²) < t ≤ p_n,
    x(t) = 1 − 2n²(t − p_n),   for p_n < t < p_n + 1/(2n²),        (2.1)
    x(t) = 0,                  otherwise,

where {p_n} is a sequence of independent random variables for which p_n is uniformly distributed on [n − 1 + 1/(2n²), n − 1/(2n²)], n = 1, 2, ....

The sample functions of the process (2.1) are trains of triangular pulses, each of unit height, each lying entirely within a unit interval. The pulse in (n−1, n) has base length 1/n². We are interested in the asymptotic convergence in probability of the process as t approaches infinity. That is, we are interested in the quantity

    lim_{t→∞} P{|x(t)| > ε}.        (2.2)

For t ∈ (n−1, n) it is clear that |x(t)| > 0 if and only if p_n ∈ [t − 1/(2n²), t + 1/(2n²)], which has probability 1/(n² − 1) of occurring. Thus, for t ∈ (n−1, n), it follows that

    P{|x(t)| > ε} ≤ 1/(n² − 1).        (2.3)

Therefore, (2.2) is equal to zero and the process converges asymptotically to zero in probability. But the samples themselves exceed any 0 < ε < 1 in every interval. Hence, the samples do not converge to zero; they are the train of triangular spikes of unit height. Thus, it is clear from this simple example that asymptotic convergence in probability may not be significant as a test of system behavior from a practical point of view. The reader may recognize that (2.2) sees the sample function properties only at one point, t, rather than on the interval (t, ∞), which is the heart of the problem here. This distinction will be discussed later.

One can trace the study of stability of stochastic systems back to the 1933 paper of ANDRONOW, WITT and PONTRYAGIN [1]. They studied one and two dimensional nonlinear systems driven by Gaussian white noise. Under various smoothness conditions, such systems give rise to a forward diffusion equation (Fokker-Planck equation [2]) whose solution is the joint density of the state variables of the system. They were interested in the asymptotic properties as t approaches infinity, and were able to solve for the time independent joint density in a number of cases. This density can be expressed as an exponential of the Hamiltonian of the unforced deterministic system. It readily follows that the relative maxima and minima of the stationary joint density function will occur at the stable and unstable equilibrium points, respectively, of the deterministic system. Thus on a probabilistic basis one can say that the most probable asymptotic states are the stable equilibrium points of the deterministic system and the least probable asymptotic states are the unstable equilibrium points of the system. A similar situation occurs along the stable and unstable limit cycles of the deterministic system.

ANDRONOW et al. [1] were motivated by Lyapunov and the, then current, fundamental work of Kolmogoroff on Markov processes and diffusion equations. They were clearly interested in stochastic system stability properties, and their paper apparently was the first of its type. However, they, also, restricted themselves to studying the probabilistic properties of the sample functions at single time parameter values.

Since a large portion of the literature has been devoted to the Lyapunov concept of stability, we shall turn now to this topic and consider in some detail various stochastic analogues of the concept of Lyapunov stability. The definitions are rather straightforward to transcribe. This is accomplished simply by changing the modes of convergence as they appear in the concepts of Lyapunov stability for deterministic systems. We state here, for reference, the concept of Lyapunov stability for deterministic systems [3]. We shall always refer to the equilibrium or null solution, x ≡ 0, as the solution whose stability properties are being tested; x_0 will denote the initial state at the initial time t_0. We will denote the solution with initial state x_0 at time t_0 by x(t; x_0, t_0), which is assumed to be an n-vector. Finally, unless specified otherwise, ||x|| will denote

    Σ_{i=1}^n |x_i|,

the simple absolute value norm.

Definition I. Lyapunov stability
The equilibrium solution is said to be stable if, given ε > 0, there exists δ(ε, t_0) > 0 such that ||x_0|| < δ implies

    sup_{t≥t_0} ||x(t; x_0, t_0)|| < ε.        (2.4)

Definition II. Asymptotic Lyapunov stability
The equilibrium solution is said to be asymptotically stable if it is stable and if there exists δ' such that ||x_0|| < δ' implies

    lim_{t→∞} ||x(t; x_0, t_0)|| = 0.        (2.5)

If (2.4) holds for any t_0, the stability is said to be uniform. If (2.5) holds for any x_0, the equilibrium solution is said to be asymptotically stable in the large.

In order to make the transition to stochastic stability concepts, we merely rewrite the convergence of Definition I in the three modes of convergence of probability theory. Notice that the variable whose convergence is being studied in (2.4) is

    sup_{t≥t_0} ||x(t; x_0, t_0)||.        (2.6)

It is precisely this random variable, whose limiting probabilistic properties as a function of the initial state x_0 are being studied, that will determine the stability of the equilibrium solution in the various stochastic senses. It must be emphasized that (2.6) is a variable depending upon the behavior of the sample paths on the entire half line (t_0, ∞). As such, one can expect stochastic stability concepts in terms of (2.6) to be much stronger than those, for example, discussed above in Ref. [1]. This will become clear below. Writing Definitions I and II in their stochastic versions gives the following definitions.

Definition Ip. Lyapunov stability in probability
The equilibrium solution is stable in probability if, given ε, ε' > 0, there exists δ(ε, ε', t_0) such that ||x_0|| < δ implies

    P{sup_{t≥t_0} ||x(t; x_0, t_0)|| > ε'} < ε.        (2.7)

Definition Im. Lyapunov stability in the mth mean
The equilibrium solution is stable in the mth mean if the mth moments of the solution vector exist and, given ε > 0, there exists δ(ε, t_0) such that ||x_0||_m < δ implies

    E{sup_{t≥t_0} ||x(t; x_0, t_0)||_m} < ε,        (2.8)

where

    ||y||_m = Σ_{i=1}^N |y_i|^m.

Definition IA.S. Almost sure Lyapunov stability
The equilibrium solution is said to be almost surely stable if

    P{ lim_{||x_0||→0} sup_{t≥t_0} ||x(t; x_0, t_0)|| = 0 } = 1.        (2.9)

Thus, Definition IA.S. says that the equilibrium solution is stable for almost all sample systems. This is the same as saying that Definition I holds with probability one. The expression (2.9) can be re-written, equivalently, as

    lim_{δ→0} P{ sup_{||x_0||<δ} sup_{t≥t_0} ||x(t; x_0, t_0)|| > ε } = 0.        (2.10)

For asymptotic stability in the various modes defined above, we shall append the following conditions.

Definition IIp. Asymptotic stability in probability
The equilibrium solution is said to be asymptotically stable in probability if Ip holds and if there exists δ' > 0 such that ||x_0|| < δ' implies

    lim_{t→∞} P{sup_{τ≥t} ||x(τ; x_0, t_0)|| > ε} = 0.        (2.11)

Definition IIm. Asymptotic stability in the mth mean
The equilibrium solution is said to be asymptotically stable in the mth mean if Im holds and there exists δ' > 0 such that ||x_0|| < δ' implies

    lim_{t→∞} E{sup_{τ≥t} ||x(τ; x_0, t_0)||_m} = 0.        (2.12)

Definition IIA.S. Almost sure asymptotic Lyapunov stability
The equilibrium solution is said to be almost surely asymptotically stable if Definition IA.S. holds and there exists δ' > 0 such that ||x_0|| < δ' implies, for any ε > 0,

    lim_{t→∞} P{sup_{τ≥t} ||x(τ; x_0, t_0)|| > ε} = 0.        (2.13)

Although the stability definitions that we have presented above are direct analogues of the concept of Lyapunov stability and are concerned with sample behavior on the infinite half line (t_0, ∞), a majority of the investigations in the literature have been concerned with stability properties of the moments as well as the distributions of the solution processes. The literature contains a great many investigations of the stability concepts that follow.

Definition IIIp. Lyapunov stability of the probability
The equilibrium solution possesses stability of the probability if, given ε, ε' > 0, there exists δ(ε, ε', t_0) > 0 such that ||x_0|| < δ implies

    P{||x(t; x_0, t_0)|| > ε'} < ε,   t ≥ t_0.        (2.14)

Definition IIIm. Lyapunov stability of the mth mean
The equilibrium solution is said to possess stability of the mth mean if, given ε > 0, there exists δ(ε, t_0) such that ||x_0|| < δ implies

    E{||x(t; x_0, t_0)||_m} < ε,   t ≥ t_0.        (2.15)

The asymptotic stability definitions can easily be supplied by the reader. All that is required is that the supremum is deleted in the definitions IIp, IIm. One further asymptotic stability concept that has been studied for stochastic systems is exponential stability of the mean [4, 5].

Definition IV. Exponential stability of the mth mean
The equilibrium solution is said to possess exponential stability of the mth mean if there exists δ > 0 and constants α, β > 0 such that ||x_0|| < δ implies, for all t ≥ t_0,

    E{||x(t; x_0, t_0)||_m} ≤ β ||x_0||_m exp[−α(t − t_0)].        (2.16)

These last three definitions depend upon the properties of the sample solutions at one time only. Hence, it is immediately obvious that they are not as strong as the stability concepts based upon (2.6). However, under certain circumstances these seemingly weaker stability definitions do have significant implications for sample stability. Thorough discussions of the many implications that exist among the stability concepts discussed in this section may be found in [5-7].

We have by no means exhausted the possible stability concepts by the definitions above. For example, one might require only that the second moments be asymptotically bounded, a concept introduced by SAMUELS [8] and called "mean square stability"; or one might ask for the existence of bounded trajectories, a type of Lagrange stability. WONHAM [9] has introduced a stochastic counterpart that he refers to as "weak stochastic stability", which asks that any n-dimensional sphere in state space is intersected with probability one from any initial state. All of these concepts have been studied and, clearly, many more can be studied within the scope of this rather rich topic. The stability concepts IIIp, IIIm, IV as well as bounded moments were studied rather heavily before such stability concepts as Ip-IA.S., IIp-IIA.S. were looked into. The reason is simply that the techniques for looking at the latter concepts of stability were not fully understood or developed by the people interested in the topic until the early 1960's. Furthermore, although it may not always be possible to solve for the moments, one can often approximate their values, so that IIIm, IV, as well as mean square stability, have been given quite a bit of attention. This contrasts with the fact that the techniques required to study the random variable (2.6), for example, are rather sophisticated results of stochastic process theory. It is also fair to say that the body of engineering researchers interested in stability properties were more closely associated with moment properties and probabilities, without being motivated to study sample properties. This is somewhat strange when it is realized that upon observing a system in operation it is a sample that is witnessed, not an average or a probability. BERTRAM and SARACHIK recognized this [10] in their 1959 paper.

Which of all the stability concepts we have mentioned is most useful, or most significant? This must depend in the last analysis upon the problem being studied and the qualitative properties desired by the investigator. KUSHNER [11] has stated, with justification, that the proper concepts of stochastic stability, as well as the use of descriptive terminology, remain to be settled as the subject develops. However, we have maintained and still do maintain the view that for applications to real systems we desire stability properties as close to deterministic stability as possible. Thus, we continue to seek conditions that will guarantee almost sure sample stability properties.

III. STOCHASTIC SYSTEMS: ORDINARY EQUATIONS

The extensive body of specific results that have been obtained in the subject of stability of stochastic systems can be categorized in a number of ways. For the purposes of our brief survey of results, in a somewhat chronological order, we have elected to classify the results relative to the nature of the generated solution processes. Thus we shall distinguish between those system equations that have Gaussian white noise coefficients and those systems that do not. The system equations having Gaussian white noise coefficients yield solution processes that are diffusion processes and are amenable to analysis by the theory of Markov processes. However, as the reader recognizes by now, such system equations do not exist in the ordinary calculus sense, but must be reinterpreted via the calculus generated by the generalized stochastic integral [12], a concept introduced by K. ITO [13].

In this section we treat stochastic systems whose coefficient processes are not Gaussian white noise. We assume that the coefficient processes are well defined with well behaved sample properties, so that the system equations can be analysed by the ordinary rules of calculus. Such equations can be considered to be a collection of ordinary deterministic differential equations upon which has been induced a probability measure via the coefficient process. A very simple example illustrating this follows.

Example 3.1
Consider the first order differential equation

    dx/dt + f(t)x = 0.        (3.1)

We shall assume that f(t) is a coefficient drawn from the collection

    F = { f_n(·) ; f_n(t) = 2(t − n)/[(1/n⁶) + (t − n)²], n = 1, 2, ..., t ∈ [0, ∞) }.        (3.2)

Therefore, for each f_n(·), we shall have a well defined differential equation with an easily obtainable solution. Indeed, for t_0 = 0, we have the collection of solutions

    X = { x_n(·) ; x_n(t) = x_0 [(1/n⁶) + n²]/[(1/n⁶) + (t − n)²], n = 1, 2, 3, ..., t ∈ (0, ∞) }.        (3.3)

As yet we do not have a solution process, but this is easily remedied. We merely have to place a probability measure on the collection F. This can be accomplished almost arbitrarily, but for purposes of later demonstrations we define the probability as

    P{ f(·) = f_n(·) } = 1/(κ n^{1+δ})        (3.4)

for some δ, 0 < δ < 1, where

    κ = Σ_{n=1}^∞ 1/n^{1+δ}.

The probability defined in (3.4) obviously induces a probability structure on (3.3) which allows us to explicitly determine quantities such as

    P{a ≤ x(t) ≤ b},   E{x(t)},   E{x²(t)},   etc.

Naturally, we do not have to be quite so explicit in order to generate well defined sample differential equations. Indeed, suppose that the coefficient process in (3.1) is a Gaussian process with continuous sample functions (with probability one) on (−∞, ∞). We are assured that for each sample function of the Gaussian process (with probability one) we can write the solution as

    x(t) = x_0 exp[ −∫_{t_0}^t f(τ) dτ ].        (3.5)

In (3.5), x(t) is a well defined random variable whose statistical properties are easily determined from the fact that the mean and covariance of the f-process are known and

    ∫_{t_0}^t f(τ) dτ

is a Gaussian random variable.

In this case, the explicit value of the sample solution remains unknown until it is observed. This represents the type of stochastic system one actually meets in nature. The Gaussian white noise case is a mathematical abstraction. We hasten to add, however, that when the coefficient process can be replaced by the Gaussian white noise, one often has more machinery with which to analyse the solution process. However, the replacing of a Gaussian coefficient process by the Gaussian white noise is a non-trivial matter that must be given careful consideration. A fundamental discussion of these points may be found in [14].

Returning to the question of stability, ROSENBLOOM [15, 16], whose studies apparently come at the beginning of the extensive studies on the topic in the USA, considered systems of the form (3.1)

or cascades of such systems with stationary Gaussian parameter processes. He was interested in the stability properties of the moments of such systems (Definition IIIm). As we have stated above, the moments can be obtained explicitly for (3.5), with f Gaussian, so that the asymptotic properties can be studied explicitly. Thus, for the first order equation

    dx/dt + [a + f(t)]x = 0,   a > 0,        (3.6)

he shows that the equilibrium solution possesses stability of the first mean if and only if a > πs_0, where s_0 is the value at ω = 0 of the spectral density function of the f-process.

However, a few years later BERTRAM and SARACHIK [10], motivated by the Lyapunov second method, took a comprehensive look at the general question of stability for stochastic systems. They were, apparently, the first in this country to approach the problem by Lyapunov techniques. Typical of the results they obtained are theorems of the kind one finds in the Lyapunov method literature. Thus, for the general system of the form

    dx/dt = f[x, t, y(t)],        (3.7)

where x, f are n-vectors, f is continuous and satisfies a Lipschitz condition, f(0, t, y) ≡ 0, and the y-process possesses well behaved sample functions so that one can make sense of the sample equations as ordinary differential equations, they obtain the following theorems.

Theorem (Bertram and Sarachik)
If there exists a Lyapunov function V(x, t), defined over the state space, which satisfies the conditions
(a) V(0, t) = 0,
(b) V(x, t) is continuous in x and t and the first derivatives with respect to x, t exist,
(c) V(x, t) ≥ α||x|| for some α > 0,
(d) E{(d/dt)V(x(t), t)} ≤ 0,
then the equilibrium solution possesses stability of the first mean.

Theorem (Bertram and Sarachik)
If a Lyapunov function V(x, t) satisfies (a), (b), (c) above as well as
(d') E{(d/dt)V[x(t), t]} ≤ −g(||x(t)||),
where g(0) = 0 and g(||x||) is an increasing function, then the equilibrium solution possesses asymptotic stability of the first mean in the large.

One omission, as a condition, in their theorems is that the indicated moments must exist. This is in no way guaranteed for the general stochastic system (3.7). Indeed, the reader can easily verify that for the simple first order system

    dx/dt = b²x,        (3.8)

where b is a constant Gaussian random variable with zero mean and variance σ², all moments will cease to exist for t > 1/(2σ²).

BERTRAM and SARACHIK applied their results to systems of the type considered by ROSENBLOOM. Thus, for

    dx/dt = A(t)x,        (3.9)

where x is an n-vector and A is the diagonal matrix

    A(t) = diag[a_1(t), a_2(t), ..., a_n(t)],        (3.10)

they use the Lyapunov function

    V(x) = x^T x        (3.11)

to establish the conditions

    E{ a_i(t) exp[ ∫_{t_0}^t a_i(u) du ] } < 0   for all t ≥ t_0,   i = 1, 2, ..., n,        (3.12)

which guarantee asymptotic stability in the large of the first mean.

These results repeat those of ROSENBLOOM, for a_i Gaussian, but are obtained by more general techniques. They also considered systems of the form (3.9) for the case that

    A(t) = A_k,   t_k ≤ t < t_{k+1},   k = 0, 1, 2, ....        (3.13)

For this piecewise constant coefficient case, the solution may be written as

    x(t) = Φ_k(t − t_k) Π_{i=1}^k Φ_{i−1}(t_i − t_{i−1}) x(t_0)   for t_k ≤ t < t_{k+1},        (3.14)

where

    Φ_k(t − t_k) = e^{A_k(t − t_k)}.        (3.15)

In case the A_k are independent on successive intervals, then the negative definiteness of

    E{ Φ_k^T(t − t_k)(A_k^T Q + Q A_k) Φ_k(t − t_k) }        (3.16)

on each interval, for a positive definite Q, is sufficient to guarantee global asymptotic stability of the first mean.

Although BERTRAM and SARACHIK did not establish results for any essentially new systems, they must be credited with showing how Lyapunov techniques can be applied to determining stochastic stability properties. These are essentially the same kinds of theorems that must be applied to any stochastic system, except that further assumptions must be put upon the nature of the properties of the stochastic system in order to extend the results to explicit systems. We shall see in the next section how much further these ideas can be carried in order to obtain explicit results.

Independently, and almost simultaneously, KATS and KRASOVSKII [4], in the USSR, also published an investigation of this type (involving Lyapunov function ideas). They also considered stability as defined in Definitions IIIp, IIIm and IV, for systems of the form (3.7). However, unlike BERTRAM and SARACHIK, they were interested in specific properties for the y-process. They assumed that the y-process is a stationary Markov process with a finite number of states {y_1, ..., y_r}. The probability of transition from state y_j to state y_i in time Δt was assumed to satisfy

    p_{ij}(Δt) = α_{ij}Δt + o(Δt),   i ≠ j,   α_{ij} constant,   i, j = 1, ..., r.        (3.17)

This guarantees that the {x, y} process is Markov and allows one to obtain a specific formula for the quantity

    dE{V(x, t, y)}/dt,

where V is a Lyapunov function for the stochastic system (3.7), which is basic in the theorems of [4] as well as [10]. Indeed, from Markov process theory [17] one has, denoting the conditional mean of V with x(t) = x, y(t) = y_j by E{V | x, y_j, t},

    dE{V | x, y_j, t}/dt = ∂V/∂t + Σ_{i=1}^n (∂V/∂x_i) f_i(x, t, y_j) + Σ_{k≠j} α_{jk}[ V(x, t, y_k) − V(x, t, y_j) ].        (3.18)

Hence, the expectation can be evaluated without direct integration of the stochastic differential equations. Naturally, V must possess the required derivatives.

The general theorems in [4] are similar to those of [10]. Moreover, KATS and KRASOVSKII also treated Definition IV stability, exponential stability. The following theorem gives their necessary and sufficient conditions.

Theorem (Kats and Krasovskii)
The equilibrium solution possesses exponential stability of the second mean if and only if there exists a Lyapunov function V(x, t, y) that satisfies, for some constants c_1, c_2, c_3,

    (a) c_1||x||² ≤ V(x, t, y) ≤ c_2||x||²,   0 < c_1 < c_2,
    (b) dE{V | x, y, t}/dt ≤ −c_3||x||²,   c_3 > 0.        (3.19)

Restricting their attention to systems of the form

    dx/dt = A(y)x,        (3.20)

they establish that if the equilibrium solution possesses asymptotic stability of the mean, then for any positive definite form ω(x, y) there exists a unique form (Lyapunov function) v(x, y) for which

    dE{v | x, y}/dt = −ω(x, y).        (3.21)

Thus, by evaluating (3.18) for the form

    v(x, y_j) = Σ_{i=1}^n b_i(y_j) x_i²,

they obtain

    Σ_{l=1}^n [ a_{l1}(y_j)x_1 + ... + a_{ln}(y_j)x_n ] ∂v/∂x_l + Σ_{k≠j} α_{jk}[ v(x, y_k) − v(x, y_j) ] = −ω(x, y_j).        (3.22)

They are then able to obtain conditions on the coefficients of v by equating them with the coefficients of the positive definite form ω via equation (3.22).

Whereas KATS and KRASOVSKII applied these techniques to the case that the coefficient processes are Markov, with finite states, they did not directly use the fact that they are piecewise constant. In his 1961 Ph.D. thesis [5] BHARUCHA concentrated on the piecewise constant properties for linear systems to obtain a number of interesting results.
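The finite-state machinery of (3.17)-(3.22) lends itself to direct computation. The sketch below is our own illustration, not taken from [4]: for a scalar system dx/dt = a(y)x with a finite-state Markov coefficient y of generator (α_ij), a standard computation shows that the partial second moments m_i(t) = E{x²(t); y(t) = y_i} satisfy dm/dt = (2 diag(a) + Qᵀ)m, so exponential stability of the second mean reduces to an eigenvalue check. The two-state numbers are assumed for illustration.

```python
import numpy as np

def second_moment_matrix(a, Q):
    """Generator of m_i(t) = E{x^2(t); y(t)=y_i} for dx/dt = a(y) x,
    where Q is the generator matrix of the Markov chain y (rows sum to zero)."""
    return 2.0 * np.diag(a) + Q.T

def second_mean_exponentially_stable(a, Q):
    # Stable iff every eigenvalue of the moment generator has negative real part.
    eigs = np.linalg.eigvals(second_moment_matrix(a, Q))
    return bool(np.all(eigs.real < 0))

# Assumed two-state example: the system is unstable in state 2 (a = +1)
# but leaves that state quickly and spends most time in the stable state 1.
a = np.array([-3.0, 1.0])
Q = np.array([[-0.5, 0.5],
              [5.0, -5.0]])
print(second_mean_exponentially_stable(a, Q))   # True for these numbers
```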

The fact that the systems are linear allowed him to use direct methods of solution, similar to those applied by BERTRAM and SARACHIK. He studied equations of the form

    dx/dt = A(t)x,

where A(t) = A_k on t_{k−1} ≤ t < t_k, k = 1, 2, ..., x is an n-vector and A(t) is an n × n matrix. He assumed two stochastic structures for the coefficient process:

    (a) {(t_k − t_{k−1})A_k} is a sequence of independent, identically distributed random matrices;
    (b) {(t_k − t_{k−1})A_k} is a finite state Markov chain of random matrices.        (3.23)

Note that the time intervals {t_k − t_{k−1}} are also allowed to be random variables.

BHARUCHA's thesis was motivated by, and is an extension of, results of BERGEN [17]. Primarily interested in asymptotic stability of the mean as well as exponential stability of the mean, BHARUCHA applies the device of Kronecker products of matrices first used by BELLMAN [19].

The Kronecker product of the m × n and p × q matrices A, B, denoted by A × B, is defined as the direct product

    A × B = | a_{11}B ... a_{1n}B |
            |   ...         ...  |
            | a_{m1}B ... a_{mn}B |,        (3.24)

which is an mp × nq matrix. A similar definition holds for vectors.

For the study of the second moments of the solution vectors of (3.22), with initial condition x_0, we denote the direct product of x_k = x(t_k) with itself as

    x_k^[2] = x_k × x_k.        (3.25)

It easily follows that

    x_k^[2] = Φ_k^[2] Φ_{k−1}^[2] ··· Φ_1^[2] x_0^[2],        (3.26)

where

    Φ_k = e^{A_k(t_k − t_{k−1})},

the fundamental matrix as in (3.15). Formulae similar to (3.26) hold for the ith Kronecker products as well. A result typical of those obtained by BHARUCHA is given in the next theorem.

Theorem (Bharucha)
Let Φ_k^[i] denote the ith Kronecker product of the random fundamental matrix with itself. Let the matrices be independent and identically distributed as in (3.23a). Then the equilibrium solution possesses stability of the ith moments if and only if no eigenvalues of E{Φ^[i]} lie outside the unit circle and any eigenvalue on the unit circle is simple. The equilibrium solution possesses asymptotic stability of the ith moments if and only if all eigenvalues lie inside the unit circle.

BHARUCHA also shows that the asymptotic stability will imply exponential stability. Moreover, if exponential stability holds for some even value of i, then the sample systems possess asymptotically stable equilibrium solutions with probability one. Thus, we see that an asymptotic moment property yields sample stability properties. Hence, as we indicated in Section II, under certain conditions moment stability will yield desirable sample properties. This point was noticed by BERGEN [18] and, before him, by KALMAN [20]. A general statement was made by KOZIN [7], which does not require conditions such as the independence of the coefficients on successive intervals.

For the Markov case, BHARUCHA obtains a similar set of hypotheses. However, the matrix E{Φ_k^[i] ··· Φ_1^[i]} becomes a product of block matrices Y_i and Q_{ij} formed from the ith Kronecker powers of the states X_j of the chain, the initial probabilities

    a_i = Prob{Φ_1 = X_i},

and the transition probabilities

    P_m = (p_{ij}),   p_{ij} = Prob{Φ_k = X_j | Φ_{k−1} = X_i}.        (3.28)

The conditions are on the matrices Y_i, Q_{ij}; the eigenvalue hypotheses for this matrix and the conclusions are the same as in the theorem above.

Example 3.2 (Bharucha)
Consider the system (3.22), where t_k − t_{k−1} = T, k = 1, 2, ..., and the A_k's are independently, identically distributed as

    A_k = | a  1 |   with probability p,   a < 0,
          | 0  a |

    A_k = | b  1 |   with probability 1 − p,   b > 0.        (3.29)
          | 0  b |

Thus,

    Φ = e^{aT}B   with probability p,
    Φ = e^{bT}B   with probability 1 − p,        (3.30)

where

    B = | 1  T |
        | 0  1 |.

We note that the system solutions possess a positive probability of increasing exponentially in any interval (t_{k−1}, t_k). Thus,

    E{Φ^[2]} = [ p e^{2aT} + (1 − p) e^{2bT} ] B^[2],        (3.31)

where

    B^[2] = | 1  T  T  T² |
            | 0  1  0  T  |
            | 0  0  1  T  |
            | 0  0  0  1  |.        (3.32)

The characteristic equation of B^[2] is simply

    (1 − λ)⁴ = 0.        (3.33)

Therefore, unity is an eigenvalue of multiplicity 4. In order to guarantee that the eigenvalues of (3.31) all lie inside the unit circle, we must have

    (e^{2bT} − 1)/(e^{2bT} − e^{2aT}) < p < 1,        (3.34)

which guarantees asymptotic moment stability, as well as exponential stability of the moments, and hence almost sure sample stability.

BHARUCHA's results allow one to treat a wide class of systems by this direct method. But it is clear that it can be quite cumbersome due to the Kronecker products. For example, in the case of the first mean norm, an nth order equation with an m state Markov coefficient process requires evaluation of the eigenvalues of an mn × mn matrix.

Comparing [4] and [5], it is interesting to note that KATS and KRASOVSKII do not directly make use of the fact that the coefficient processes are piecewise constant on random intervals in their treatment of Markov coefficients, and BHARUCHA does not fully exploit the results of Markov chain theory.

The ergodic properties of Markov chains are applied by PALMER [21] to the study of linear systems of the form (3.22), where the coefficient matrices constitute a stationary positive recurrent Markov chain with a finite or countably infinite number of states, and where the sojourn times are independently distributed. PALMER applied Markov chain theory to determine conditions sufficient to guarantee almost sure asymptotic stability of the equilibrium solution. In this study he made quite significant use of the ideas of Markov chain theory as well as of the piecewise constancy. Thus, he was able to extend the work of KATS and KRASOVSKII, as well as BHARUCHA, as they relate to linear systems with Markov coefficients.

MOROZAN [22] has also treated the piecewise constant case for general conditions on the coefficient process. SOEDA et al. [23] have considered piecewise constant coefficient systems similar to those in [10].

We must pause a moment and ask ourselves what it is that characterizes much of the work described above. The basic feature of all of the systems treated explicitly by the investigators cited above is that they can be solved in closed form. Although the results of BERTRAM and SARACHIK are general, they do not show how to obtain the necessary expectations. The results of KATS and KRASOVSKII allow one to obtain equations for the needed expectations but involve finite state, or piecewise constant, coefficients. ROSENBLOOM, BERGEN, BHARUCHA, PALMER all require explicit integration. For the continuous parameter case, ROSENBLOOM was, essentially, only able to study first order systems. In order to break out into the study of higher order linear systems, piecewise constancy was chosen by these researchers. It is quite obvious that one would like to have results which apply to higher order continuous parameter systems without being restricted only to that class of systems that can be explicitly solved in closed form.

A result along these lines was presented in [24] for linear stochastic systems. The results in this work were established for systems of the form

    dx/dt = [A + F(t)]x,        (3.35)

where x is an n-vector, A is an n × n stability matrix, and F(t) is an n × n matrix whose non-zero coefficients f_{kj}(t) are stationary, ergodic processes with almost surely continuous sample functions. The main theorem follows.

Theorem (Kozin)
Let the coefficient matrix F(t) possess the properties stated above. Let the expectation E{||F(t)||} exist. It follows that there is a constant α, depending upon the stability matrix A, such that

    E{||F(t)||} < α

implies that the equilibrium solution of the system (3.35) is asymptotically stable with probability one.
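Returning for a moment to Example 3.2, Bharucha's eigenvalue test is easy to check numerically. The sketch below is our own (in Python, with assumed numbers): it builds E{Φ^[2]} from (3.31) and compares the spectral-radius test against the closed-form threshold (3.34).

```python
import numpy as np

def second_moment_stable(a, b, p, T):
    """Bharucha's test for Example 3.2: asymptotic stability of the second moments
    iff all eigenvalues of E{Phi^[2]} = [p e^{2aT} + (1-p) e^{2bT}] (B x B)
    lie inside the unit circle."""
    B = np.array([[1.0, T], [0.0, 1.0]])
    EPhi2 = (p * np.exp(2 * a * T) + (1 - p) * np.exp(2 * b * T)) * np.kron(B, B)
    return np.max(np.abs(np.linalg.eigvals(EPhi2))) < 1.0

def threshold(a, b, T):
    # Closed-form lower bound on p from (3.34)
    return (np.exp(2 * b * T) - 1.0) / (np.exp(2 * b * T) - np.exp(2 * a * T))

a, b, T = -1.0, 0.5, 1.0
p_star = threshold(a, b, T)
for p in (p_star - 0.05, p_star + 0.05):
    print(round(p, 3), second_moment_stable(a, b, p, T))
# prints False just below the threshold and True just above it
```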

The proof of the theorem is simply based upon the Gronwall-Bellman lemma of differential equation theory [3], as well as the fact that for stationary ergodic processes

    lim_{t→∞} (1/t) ∫_{t_0}^t ||F(s)|| ds = E{||F(s)||}        (3.36)

with probability one. Hence, we have a sufficient condition that applies to general linear systems with ergodic coefficients. Such coefficient processes include, for example, the important class of processes generated by passing white noise through a linear filter.

The result above applies for any norm, as discussed in MEHR and WANG [25]. However, the choice of norm significantly determines the strength of the sufficient condition obtained. For example, CAUGHEY and GRAY [26], using Lyapunov techniques, extended the results above to obtain sharper sufficient conditions than can be obtained through application of the Gronwall-Bellman approximation. One very interesting result in [26] is the application of these ideas to second order non-linear differential equations of the form

    d²x/dt² + 2ζω dx/dt + [ω² + f(t)]x + g(x) = 0,        (3.37)

where f(t) is a sample from a stationary ergodic process with continuous sample functions, with probability one, and g is a continuous non-linear function satisfying certain symmetry as well as monotonicity properties.

ARIARATNAM [27] obtains sharper sufficient conditions than in [24], MOROZAN [28] extends [24] to the case that the A-matrix is random, and WANG [29] extends the results of [24] to the study of the stability properties of distributed parameter systems.

The most recent extension of the results on ergodic coefficient stochastic systems is due to INFANTE [30]. Applying results on the extremal properties of the eigenvalues of pencils of quadratic forms, he is able to obtain the following theorem.

Theorem (Infante)
If, with the usual conditions on A and F(t), for some positive definite matrix B and some ε > 0,

    E{ λ_max( A^T + F^T(t) + B[A + F(t)]B^{−1} ) } ≤ −ε,

then the equilibrium solution of the system (3.35) is almost surely asymptotically stable.

Applying this theorem to the system

    dx_1/dt = x_2,
    dx_2/dt = −2ζx_2 − [1 + f(t)]x_1,        (3.38)

which was studied in [24-27], INFANTE obtains

    E{f²(t)} < 4ζ² − ε,        (3.39)

which is truly a dramatic extension of the region of stability over [24-27]. Moreover, the fact that the variance of the coefficient process can approach infinity as the damping coefficient approaches infinity answers a conjecture raised by MEHR and WANG [25].

We mention in passing that GRAY [31] has recently studied linear ergodic coefficient systems, directing his attention to the frequency content of the coefficient process in an attempt to study the narrow band case.

The reader might, at this point, have wondered what implications the "deterministic stability" that we refer to as almost sure sample stability has for stability of the moments. We can easily demonstrate what can happen by Example 3.1. It is obvious from (3.3) that all solutions approach zero asymptotically, so that for this linear stochastic system the equilibrium solution is asymptotically stable with probability one. But, upon computing, say, the first moment E{|x(t)|}, one can show that for t = n

    E{|x(n)|} ≥ |x_0| κ^{−1} n^{−(1+δ)}(1 + n⁸).        (3.40)

Hence, the first moment and all higher moments are unbounded, even though all samples approach zero! Another example of this phenomenon will be illustrated in the next section.

We shall close this section by briefly discussing some results on a rather significant problem of stochastic stability theory. In his rather interesting papers, BOGDANOFF [32, 33] has performed both analytical and experimental investigations of the problem of stabilizing an unstable deterministic system through the introduction of random parametric excitation, where the resulting random system possesses an equilibrium solution that is almost surely stable. The random parametric excitation that BOGDANOFF studies is of the form

    ξ(t) = λ Σ_{k=−N}^{N} c_k e^{i(ω_k t + φ_k)},        (3.41)

where the φ's are independent random variables, uniformly distributed on [0, 2π]. This excitation is a finite term Rice type noise having a discrete spectrum. He establishes that if λ is small enough, the frequency sums |ω_k + ω_j|, k ≠ j, are large enough, and E{[ξ(t)]²} > g_1, then the random system

    dx_1/dt = x_2,
    dx_2/dt = −[1 + ξ(t)] sin x_1 − 2εx_2        (3.42)

possesses an almost surely stable equilibrium solution, whereas the deterministic system obtained upon

setting the noise term equal to zero possesses an unstable equilibrium.

BOGDANOFF mentions that it may not be possible to stabilize, on an almost sure basis, with random parametric excitations having continuous spectra. However, this question still does remain somewhat open. We shall discuss it somewhat more in the next section.

IV. STOCHASTIC SYSTEMS: ITO EQUATIONS

In this section we shall concern ourselves with systems of the type

    dx/dt = m(x, t) + σ(x, t)W,        (4.1)

where x is an n-vector, m is an n-vector, σ is an n × n matrix and W is an n-vector of Gaussian white noise components. These components may or may not be correlated with one another. We assume m(0, t) ≡ 0, σ(0, t) ≡ 0.

The differential equation (4.1) makes no sense as far as the ordinary rules of the calculus are concerned. The indicated derivative simply does not exist. This is due to the fact that the Gaussian white noise is not a stochastic process. (It can, however, be rigorously defined as a random distribution [34].) The formal representation of the Gaussian white noise can be stated through the simple formula

    W_t = dB_t/dt        (4.2)

(see [35], section 14, for a simple derivation), where {B_t, t ∈ (0, ∞)} is the Brownian motion process. That is, it is the zero mean Gaussian process with stationary independent increments for which

    (a) E{B_t B_s} = min(s, t),
    (b) P{B_0 = 0} = 1.        (4.3)

It has been known since the monumental works of Wiener (this process is often referred to as the Wiener process) that almost all the sample functions of the process are continuous functions that are nowhere differentiable. Hence, the relation (4.2) is also a fiction. The basic reason for this pathology is that the sample increments (ΔB_t)² are O(Δt) and not O[(Δt)²].

Fortunately, a rigorous interpretation of the equation (4.1) has been given in the fundamental works of ITO [13], by means of the generalized stochastic integral. Since ITO's work, the differential equation (4.1), employing the formal relation (4.2), is usually written as the equation in differentials

    dx = m(x, t)dt + σ(x, t)dB.        (4.4)

The equation (4.4) obtains its rigorous meaning via the stochastic integral equation

    x_t − x_{t_0} = ∫_{t_0}^t m(x_τ, τ)dτ + ∫_{t_0}^t σ(x_τ, τ)dB_τ.        (4.5)

Such equations are now generally referred to, in the literature, as stochastic equations of Ito type. In order to guarantee the existence and uniqueness of the solution process, uniform Lipschitz conditions as well as growth conditions are placed on the coefficients m and σ. Under such conditions x is a vector Markov process with continuous sample functions [36]. Furthermore, the solution process is associated with the differential operator

    L = ½ Σ_{i,j=1}^n b_{ij}(x, t) ∂²/∂x_i∂x_j + Σ_{i=1}^n m_i(x, t) ∂/∂x_i,        (4.6)

where (b_{ij}) = B = σσ^T. The operator L is referred to as the differential generator of the process defined by (4.4) or (4.5). It may also be called the backward diffusion operator, to distinguish it from the adjoint, forward diffusion operator (see [17]).

There exists a differential calculus for processes of the type defined by (4.4). The calculus, introduced by ITO [37], is defined via the following differential formula, for suitable twice continuously differentiable functions G(t, x):

    dG = (∂G/∂t)dt + Σ_{i=1}^n (∂G/∂x_i)dx_i + ½ Σ_{i,j=1}^n (∂²G/∂x_i∂x_j) b_{ij}(x, t)dt,        (4.7)

or, equivalently, from (4.6),

    dG = [∂G/∂t + LG]dt + Σ_{i,j=1}^n (∂G/∂x_i) σ_{ij}(x, t)dB_j.        (4.8)

The reader may note that (4.7) differs from the differential formulae of the ordinary calculus only in the addition of the last sum on the right hand side. On an intuitive basis this last sum is easy to justify. The reader may recall that the differential of ordinary calculus is a function of order dt, i.e. O(dt). Therefore, to obtain the differential of G one must include all terms of O(dt). However, as we mentioned above, (dB)² is O(dt). Hence, those terms which involve second powers of dB must be included in the differential. These terms will appear in the second derivative terms of the Taylor expansion, which can be shown to produce the second summation on the right hand side of (4.7).

The solutions to Ito equations can be obtained in a number of simple cases, as the following first order equations illustrate.
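Before those analytic examples, a brief numerical remark. The integral equation (4.5) also suggests how Ito equations are approximated in practice: the diffusion coefficient is evaluated at the left endpoint of each increment, which is exactly what the Euler-Maruyama scheme below does. This sketch is our own illustration (scalar case, assumed drift and diffusion functions), not part of the survey.

```python
import numpy as np

def euler_maruyama(m, sigma, x0, T, dt, rng):
    """Approximate a scalar Ito equation dx = m(x,t) dt + sigma(x,t) dB,
    evaluating sigma at the left endpoint of each increment as in the Ito integral (4.5)."""
    n = int(T / dt)
    x = np.empty(n + 1)
    x[0] = x0
    t = 0.0
    for k in range(n):
        dB = rng.normal(0.0, np.sqrt(dt))       # increment of Brownian motion
        x[k + 1] = x[k] + m(x[k], t) * dt + sigma(x[k], t) * dB
        t += dt
    return x

# Assumed example: the linear equation dx = a x dt + s x dB, i.e. (4.9a) below.
rng = np.random.default_rng(2)
a, s = -0.5, 0.8
path = euler_maruyama(lambda x, t: a * x, lambda x, t: s * x,
                      x0=1.0, T=10.0, dt=1e-3, rng=rng)
print(path[-1])
```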

Example 4.1
The stochastic differential equations

    (a) dx = ax dt + σx dB,
    (b) dx = ⅓x^{1/3} dt + x^{2/3} dB        (4.9)

possess the solutions

    (a) x_t = x_0 exp{[a − (σ²/2)]t + σB_t},
    (b) x_t = (x_0^{1/3} + ⅓B_t)³,        (4.10)

respectively. The solutions (4.10) can easily be established by determining their differentials via formula (4.7).

If we were confronted by (4.10) without an explanation of the concept of stochastic equations with Gaussian white noise coefficients, we would probably state, on the basis of our early training in calculus, that the functions (4.10) are the solutions to

    (a) dx = [a − (σ²/2)]x dt + σx dB,
    (b) dx = x^{2/3} dB.        (4.11)

It is easy to identify the differences between (4.9) and (4.11).

The difference in the nature of the calculus required for processes generated by stochastic differential equations with Gaussian white noise coefficients led to a few misunderstandings during the early stages of applications of such models. But one can now say that the relationship between such systems and those treated in section III of this paper is becoming widely understood. Again, we refer the reader to [14], as well as to STRATONOVICH [38], ASTROM [39], CAUGHEY and GRAY [40], and CLARKE [41], for different views on the topic.

Having had this very brief introduction to the interesting concepts involved when dealing with Ito equations, we can now proceed to our discussion of the various stability studies that have been made relative to them.

One of the first stability studies of nth order differential equations of Ito type was directed to linear systems by SAMUELS [8, 42, 43]. He investigated the asymptotic properties of the second moments of systems of the form

    dx_1 = x_2 dt,
    dx_2 = x_3 dt,
    ...
    dx_n = −Σ_{l=1}^n [a_l dt + σ_l dB_l] x_l,        (4.12)

where the a_j, σ_j are constants, the component Brownian motions may be correlated, and at least one of the σ_j is non-zero.

SAMUELS did not make use of the fact that the state vector is a Markov process. Instead he studied the system directly by successive approximations, starting with the solution to the deterministic constant coefficient system in (4.12). Naturally, such an approach allowed him to study linear systems with general coefficient processes on the basis of successive approximations to the moments. Upon applying the independence properties of the white noise, tantamount to the property of independent increments of the Brownian motion, he obtained a set of sufficient conditions in terms of the constants in (4.12) that imply asymptotic boundedness of the second moments, E{x_i²(t)}, which he has referred to as mean square stability. In fact, the conditions guarantee more than boundedness. They imply that the second moment decays exponentially to zero, i.e. Definition IV of section II.

In [43] SAMUELS brought up the question of stabilizing a deterministic, linear, unstable system by the introduction of Gaussian white noise excitation in the system parameters. The question is the same as that treated by BOGDANOFF [32], as we have already seen, and it continues to be of significance to this day. Apparently, by stabilizing, SAMUELS meant that the second moments of the stochastic system would be asymptotically bounded. SAMUELS indicated that a system of the form

    dx_1 = x_2 dt,
    dx_2 = [βx_2 − ω_0²x_1]dt − σx_2 dB,   β > 0,        (4.13)

possesses stable second moments, whereas the deterministic system (obtained by setting the noise term equal to zero in (4.13)) does not possess a stable equilibrium solution.

However, certain numerical errors in [43], pointed out by CAUGHEY [44], did cast some doubt on the possibility of moment stabilization by white noise. For the case treated in [32, 33], almost sure sample stabilization was established both theoretically and experimentally. In that case, however, the spectrum of the noise coefficients is discrete. The question still remains somewhat open for stabilization on an almost sure sample basis by processes with continuous spectra. The question of stabilization of the moments was also discussed by LIEBOWITZ [45], as well as by BOGDANOFF and KOZIN [46]. As we shall see in [48, 49], a definitive answer to stabilization of the moments, as treated in [43-46], has been given.

We can easily show by the formal techniques used in [46] that the equilibrium solution of (4.13) does not possess asymptotic stability of the second

moments. The Fokker-Planck equation for (4.13) can easily be shown to be [46]

    ∂P/∂t = −x_2 ∂P/∂x_1 − β ∂(x_2 P)/∂x_2 + ω_0²x_1 ∂P/∂x_2 + (σ²/2) ∂²(x_2² P)/∂x_2²,        (4.14)

where P is the conditional probability density function for the state variables (x_1, x_2) of (4.13). From (4.14), one can formally obtain equations for the second moments by multiplying (4.14) respectively by x_1², x_1x_2, and x_2², and then integrating the equation over the real plane R². The equations are

    ṁ_{2,0} = 2m_{1,1},
    ṁ_{1,1} = −ω_0²m_{2,0} + βm_{1,1} + m_{0,2},
    ṁ_{0,2} = −2ω_0²m_{1,1} + (2β + σ²)m_{0,2},        (4.15)

where

    m_{i,j} = E{x_1^i x_2^j},   i, j = 0, 1, 2,   i + j = 2.        (4.16)

The stability of the moments is determined by the real parts of the roots of the characteristic equation

    λ³ − (3β + σ²)λ² + [4ω_0² + β(2β + σ²)]λ − 2ω_0²(2β + σ²) = 0.        (4.17)

In order that the solutions of (4.15) asymptotically approach zero, the Routh-Hurwitz conditions require the coefficients of (4.17) to be positive. This is obviously not so, since (3β + σ²) > 0. Hence, the second moments cannot be stable.

During discussions of stability in [43] and [44], the phrase "stabilizing the system" was often used when in fact the moments of the system were under consideration. In [44] it is mentioned that "mean square stability is a necessary--but not sufficient--condition for stability of a system." In fact, however, almost exactly the opposite is the case. We have seen, by Example 3.1, that the sample equations possess asymptotically stable equilibrium solutions with probability one, yet all moments become unbounded. Furthermore, under certain conditions, exponential stability of the second moments implies almost sure sample stability. This in no way invalidates the various remarks in [44], but it does serve to illustrate the somewhat non-intuitive relationships that exist between moment stability and almost sure sample stability.

To illustrate this point for Ito equations, let us consider the first order equation (4.9a), which possesses the sample solution (4.10a). The nth moments of (4.10a) for, say, a deterministic initial condition x_0 are easily shown to be

    E{x_t^n} = x_0^n exp{ [a − (σ²/2)]nt + (σ²n²/2)t }.        (4.18)

We have asymptotic stability (exponential) of the nth moments if and only if

    a < (σ²/2)(1 − n).        (4.19)

Hence, for n = 2 the condition is a < −σ²/2; for n = 4 it is a < −3σ²/2, and so on. It is obvious from (4.19) that one can have asymptotic stability of the second moments yet the fourth moment will be unbounded, and so on for higher moments. But an even more interesting property of the solutions (4.10a) holds. It is known (see e.g. [47], p. 560) that the sample functions of the Brownian motion do not grow faster than √(t log t) as t approaches infinity, with probability one. Hence, the asymptotic properties of the samples (4.10a) are determined, with probability one, by the sign of the coefficient (a − σ²/2). They approach zero exponentially with probability one if and only if a < σ²/2. It is important to note that for −σ²/2 < a < σ²/2 almost all of the samples approach zero, so that the equilibrium solution is almost surely asymptotically stable, even globally. However, the second and higher moments will increase exponentially to infinity. Hence, we again see that stability of the moments is not necessary for almost sure sample stability of the system. Of course, if the moments are stable for (4.9a), then the equilibrium solution of the system will also be almost surely stable.

The general question of bounded moments for Ito equations has recently been studied by ZAKAI [51]. He shows, for example, that under certain asymptotic conditions on the coefficients f, G, if A is a stability matrix, the system

    dx = Ax dt + f(x)dt + G(x)dB        (4.20)

possesses bounded moments. That is, for each m = 1, 2, ... there exists a positive constant K_m for which the upper limit satisfies

    lim sup_{t→∞} E{||x(t; x_0, t_0)||^{2m}} ≤ K_m.        (4.21)

In two recent works [48, 49], NEVELSON and KHAS'MINSKII have performed comprehensive studies of stability of the moments (Definition IIIm), exponential stability of the moments (Definition IV), and their relationship to stability in probability (Definition Ip). The pattern of [48] follows that of [4] somewhat, except that the systems being studied in [48] are Ito equations. NEVELSON and KHAS'MINSKII apply Lyapunov second method techniques as in [4] and are able to obtain specific results in terms of conditions on Lyapunov functions. Naturally, the differential generator (4.6) will be basic here. For example, they prove,

lhcorcm ( Ncvelson and Khas'minskii) for this type of stability is given by tile R o u t h -
Let the function V(t, .v) be twice continuously Hurwitz determinantal conditions of the constant
differentiable in x and continuously differentiable coefficients a i of the deterministic part of (4.12),
in t. Let V satisfy', for some q , c2, c3, c4, >0, that is

Al=al> 0 A2 = /1 a3 > 0
' a2 '
LV(t, x)<~ -- C
3llxll,,,
m

~3V " ,,, 1 a t c;l 3 a 5 0


I< i = 1. . . . n, (4.22)
1 a2 a4 0

0 a t a 3 0
then the equilibrium solution of (4.4) with differen- • . . A,, = >0 (4.25)
tial generator (4.6) possesses exponential stability 0 1 a2 0
of the mth moments.
In a certain sense this theorem does not offer any basically new techniques. Indeed, the ideas involved can, essentially, be found in [4] and [10]. It is the application to the system (4.4) that is new. They also establish that, with the additional hypothesis of homogeneity in x of order m, the conditions above are necessary and sufficient to guarantee the exponential stability of the mth moments for systems (4.4) whose coefficients m and σ are linear in x.

As a result of one of the theorems of their paper [48], they establish that if the deterministic system

   dx = M(t)x dt ,   (4.23)

does not possess an asymptotically stable equilibrium solution, then the equilibrium solution of

   dx = M(t)x dt + N(t, x)dB ,   (4.24)

where N is linear in x, cannot possess asymptotic stability of the mth moments, for any m.

This very significant result appears to answer, in a definite fashion, the discussions that took place in [42-46]. For linear systems with Gaussian white noise coefficients, one now knows that if the moments decay asymptotically, then the deterministic system obtained by setting the noise terms equal to zero possesses an asymptotically stable equilibrium solution.

We wish to stress that this does not rule out SAMUELS' basic intuitive feeling, enhanced by experiments [43], that an unstable deterministic system can be stabilized by introducing noise into the system parameters. It merely supports this writer's view that we should look at almost sure sample stability of the system instead of moment properties.

In [49] NEVEL'SON and KHAS'MINSKII devote their attention to the specific linear system of the form (4.12). Based upon results obtained in [48], they present explicit conditions to guarantee asymptotic stability of the second moments. In particular they show that the necessary and sufficient condition for this type of stability is given by the Routh-Hurwitz determinantal conditions on the constant coefficients a1, . . ., an of the deterministic part of (4.12), that is

   Δ1 = a1 > 0 ,      Δ2 = | a1  a3 |  > 0 ,     . . . ,
                           | 1   a2 |

        | a1  a3  a5  . . .  0  |
        | 1   a2  a4  . . .  0  |
   Δn = | 0   a1  a3  . . .  0  |  > 0 ,   (4.25)
        | 0   1   a2  . . .  0  |
        | .   .   .          .  |
        | 0   0   0   . . .  an |

plus the condition

   Δn > Δ̃ ,   (4.26)

where Δ̃ is Δn modified by adding one new row of terms which are prescribed linear functions of the constants σij = σi σj. These results put, in a concise form, the conditions applied in [43-46].
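The purely deterministic part of these conditions is mechanical to check. The sketch below is ours (it is not taken from [49]); it assembles the Hurwitz matrix from the coefficients a1, . . ., an of the deterministic part and evaluates the leading principal minors Δ1, . . ., Δn of (4.25). The additional condition (4.26) would further require the modified determinant Δ̃, whose extra row depends on the noise intensities of the particular system (4.12).

```python
import numpy as np

def hurwitz_minors(coeffs):
    """Leading principal minors Delta_1, ..., Delta_n of the Hurwitz matrix for
    s**n + a1*s**(n-1) + ... + an, given coeffs = [a1, ..., an]."""
    n = len(coeffs)
    a = [1.0] + list(coeffs)              # a[0] = 1, a[1] = a1, ..., a[n] = an
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            k = 2*(j + 1) - (i + 1)       # entry H[i, j] = a_(2(j+1)-(i+1))
            if 0 <= k <= n:
                H[i, j] = a[k]
    return [np.linalg.det(H[:m, :m]) for m in range(1, n + 1)]

# Example: s**3 + 2*s**2 + 3*s + 1.  All minors are positive, so the
# deterministic part is asymptotically stable and (4.25) is satisfied.
print(hurwitz_minors([2.0, 3.0, 1.0]))
```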

We pause for a moment to remark that a large body of moment stability results exists in the literature, based upon some approximate engineering type of analysis, for non-linear systems with Gaussian white noise coefficients (i.e. system (4.4)). A collection of such papers by KUZNETSOV, STRATONOVICH and TIKHONOV, originally published during the period 1953-1961, has recently been put under a single cover [52]. They made extensive use of the linearization techniques of KRYLOV and BOGOLIUBOV, as well as the method of averaging of BOGOLIUBOV and MITROPOLSKII [53], in order to achieve approximate moment stability results. KOLOMIETS [54] has published a number of articles recently on this topic, the latest being a paper in conjunction with MITROPOLSKII [55], which makes use of an averaging idea of KHAS'MINSKII [56] for Ito equations with time varying coefficients. It is clear that the method of averaging as applied to stochastic systems is related to [32]. However, the complete connection has yet to be developed. A very interesting investigation of the method of averaging as applied to stochastic systems can be found in [57]. SAWARAGI, SUNAHARA and SOEDA [58] apply the method of statistical linearization, introduced by a number of investigators in this country as well as abroad (see e.g. BOOTON [59]), to investigate the stability of the moments of a non-linear system via its statistically linearized counterpart.

The reader may begin to realize, even from our very brief survey of the literature on stability of stochastic differential equations of Ito type, that there is a large body of rather specific results on non-trivial stochastic systems, more so than for systems of the type treated in section III. To a certain extent this is due to the fact that there is more machinery available with which to study the asymptotic properties of the solution processes and their moments.

A number of the results we have described so far were obtained by investigations related to the Lyapunov second method. The salient feature that allows the analogous ideas to go through for Ito equations is the fact that, for any twice continuously differentiable function f(x, t), the expected value of the derivative of this function along the trajectories of the process defined by (4.4) with initial condition (x, t) is given via the differential generator (4.6) as Lf(x, t). Therefore, the average derivative properties of Lyapunov functions can be explicitly obtained in theorems requiring such conditions. As we know, this is not generally possible for the non-Markov systems discussed in section III.

There are many more basic properties of Markov process theory than those that we have come in contact with up to now in our survey. Potential theory, first passage times and martingale process theory have been applied in the last few years to obtain very significant stability properties for the equilibrium solutions of Ito equations.

Perhaps the first application of these significant and fundamental results of Markov process theory to the stability question was made by KHAS'MINSKII [60], for stability in probability of the equilibrium solution of (4.4). In particular, KHAS'MINSKII considered systems of the form (4.4) with time independent coefficients, where the second derivative term of the differential generator (4.6) is a non-degenerate elliptic operator in the vector x. That is, there exists a continuous function m(x) > 0, for x ≠ 0, such that for all real λ_i

   Σ(i,j=1..n) b_ij(x) λ_i λ_j ≥ m(x) Σ(i=1..n) λ_i² .   (4.27)

This guarantees that every component equation in the system (4.4) possesses noise coefficients.

Applying first passage time ideas of Markov process theory, as well as properties of elliptic operators, such as the maximum principle of potential theory, Khas'minskii established the following.

Theorem (Khas'minskii)
The equilibrium solution of the time invariant coefficient system of type (4.4), where b_ij satisfies (4.27), is stable in probability (Definition I_p) if and only if there exists a continuous, non-negative function V(x), vanishing only at x = 0, for which LV(x) ≤ 0.

He also obtained results for instability in terms of the usual type of condition for Lyapunov's second method.

This work is, apparently, the first general stability result that takes into account the sample behavior on the entire half line (t0, ∞). As such it is of practical significance. The following example illustrates the power of these results.

Example 4.2 (Khas'minskii)
Consider the first order system

   dx = m(x)dt + σ(x)dB ,   (4.28)

where the differential generator is

   L = m(x) ∂/∂x + (σ²(x)/2) ∂²/∂x² ,

and

   m(x) = m0 x + o(|x|)   as x → 0 ,
   σ²(x) = σ0² x² + o(x²)   as x → 0 .   (4.29)

KHAS'MINSKII establishes that the equilibrium solution is asymptotically stable in probability if m0 < σ0²/2 and unstable in probability if m0 > σ0²/2. We have already seen that these conditions hold on an almost sure sample basis for the linear system (4.9a). To prove stability, Khas'minskii uses V(x) = |x|^γ, where 0 < γ < 1 - 2m0/σ0². To prove instability he applies W(x) = -log|x|.
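For the linear approximation m(x) = m0 x, σ(x) = σ0 x, the role of the exponent γ is easily seen by direct computation; the symbolic check below is ours and is not taken from [60]. Working on x > 0, where |x|^γ = x^γ, one finds LV = γ[m0 + σ0²(γ - 1)/2]|x|^γ, which is negative exactly when γ < 1 - 2m0/σ0².

```python
import sympy as sp

x, gamma, s0 = sp.symbols('x gamma sigma_0', positive=True)
m0 = sp.symbols('m_0', real=True)

# Generator of the linear approximation dx = m0*x dt + s0*x dB,
#   L = m0*x d/dx + (s0**2 * x**2 / 2) d^2/dx^2 ,  applied to V(x) = x**gamma.
V = x**gamma
LV = m0*x*sp.diff(V, x) + sp.Rational(1, 2)*s0**2*x**2*sp.diff(V, x, 2)

print(sp.factor(sp.simplify(LV / V)))
# The ratio equals gamma*(m0 + s0**2*(gamma - 1)/2): it is negative precisely
# when gamma < 1 - 2*m0/s0**2, and a gamma in (0, 1) with this property exists
# exactly when m0 < s0**2/2, Khas'minskii's condition for stability in probability.
```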
In the study of higher order systems, he demonstrates, as an example, that the stable deterministic system

   dx1/dt = x2
   dx2/dt = -x1   (4.30)

becomes unstable in probability when one appends noise as, for example,

   dx1 = x2 dt + σ(x1, x2)dB1
   dx2 = -x1 dt + σ(x1, x2)dB2 .   (4.31)

For the one dimensional example, if m0 < 0 (i.e. the deterministic system is stable in the linear approximation), then adding the random term will not affect the stability properties. Even more interesting is that the first order system

   dx/dt = m(x) ,   (4.32)

which possesses an unstable equilibrium solution, can be made stable in probability by adding a noise term so long as m0 < σ0²/2. Thus, we see that for the first order system the introduction of noise can stabilize the equilibrium solution in a significant sense. KHAS'MINSKII goes on to establish that if the deterministic linear approximate system is stable, then for systems of order less than or equal to two the introduction of noise will not affect the stability. However, if the system is of dimension greater than two, then the introduction of noise with sufficiently large intensity will destabilize the equilibrium solution in probability.

These discussions by KHAS'MINSKII, of the relationship of stability of the linear approximations and of how their properties relate to stability in probability of the non-linear system, were quite an advance in the literature. Where previous investigators had studied the implications of moment stability of the linearized system, KHAS'MINSKII's investigations were directed towards a stability property that looks at the sample behavior on the entire half line.

In the work already cited above, [48], NEVEL'SON and KHAS'MINSKII proved theorems concerning the implication of stability in probability for the equilibrium solution of (4.4) from exponential stability of the nth moments of the linear first approximation equation. In a more recent work KHAS'MINSKII [50] has extended these results somewhat to show that if the linearized system satisfies

   sup(s<0) P{ sup(u>s+T) ||x(u, s, x0)|| > ε } → 0   (4.33)

as T → ∞, then the equilibrium solution is stable in probability.

It is interesting to note that in [61] it has been shown that exponential stability of the second moments of (4.4) implies that the sample solutions approach zero asymptotically with probability one. Even more, it is established that the samples themselves decay exponentially, with probability one. Hence, sufficient conditions for almost sure exponential stability of the equilibrium solutions of linear systems are established. For completeness we remark that KHAS'MINSKII [62] has recently produced a necessary and sufficient condition for the almost sure sample stability of the equilibrium solution for linear systems. In this work he does present an example of a second order unstable deterministic system that can be stabilized by the introduction of Gaussian white noise parameters, on an almost sure sample basis. Unfortunately, as the mathematical tools involved are beyond the scope of this survey, we can only refer the interested reader to this reference.

The results of KHAS'MINSKII in [60] were extended by KUSHNER [63, 64] as a result of a paper of BUCY [65]. BUCY, devoting his studies to discrete parameter versions of (4.4), recognized that Lyapunov functions of the solution process of (4.4), as functions of t, are non-negative super-martingales (see [12], Chapter 7).

A non-negative super-martingale is, essentially, a stochastic process {V_t, t ∈ T} for which E{|V_t|} < ∞ and, for any t1 < t2 < . . . < t_{n+1},

   E{V_{t_{n+1}} | V_{t_1}, . . ., V_{t_n}} ≤ V_{t_n}   (4.34)

with probability one. A basic property of non-negative super-martingales is that, for any λ ≥ 0,

   P{ sup(t0 ≤ t < ∞) V_t ≥ λ } ≤ E{V0}/λ .   (4.35)

The reader can convince himself, with a little study, that if, for a suitably chosen Lyapunov function V(x) of x_t, the solution to (4.4), E{V0} can be made arbitrarily small (V0 = V(x0) reflects the initial condition), then for any λ > 0 the probability on the left of (4.35) can be made arbitrarily small as well. This is equivalent to stability in probability as defined in this survey, and in the various works cited above. (Note: there is a slight difference of terminologies in the literature on this point. What we refer to as stability in probability, Kushner calls stability with probability one. We can only ask the reader to decide for himself which of the terminologies he prefers to use here. It can easily be established [6] that the I_p and I_a.s. stabilities are identical for linear systems.)

KUSHNER extended the idea of BUCY to continuous parameter systems and thus the range of applicability of the stochastic Lyapunov function results of KHAS'MINSKII [60]. A major extension here is that the differential generator (4.6) is not required to be an elliptic operator, so that a much wider class of systems can now be analysed. Typical of the many results of KUSHNER is the following theorem, stated in simple terms [64].

Theorem (Kushner)
If V(x) is a positive definite function, twice continuously differentiable, for which V(0) = 0 and LV(x) ≤ 0 in the region Q = {x; V(x) ≤ q}, then for ε < q it follows that, for the solution process {x_t, t ∈ (0, ∞)} of (4.4) with differential generator L,

   P{ sup(0 ≤ t < ∞) V(x_t) ≥ ε } ≤ V(x0)/ε = 1 - ρ ,   (4.36)

and x_t converges to the set of x for which LV(x) = 0 with probability at least equal to ρ.

Of the many interesting examples constructed by KUSHNER, he shows that for the system

   dx1 = x2 dt
   dx2 = -g(x1)dt - a x2 dt - x2 σ dB ,   (4.37)

where

   ∫(0 to x) g(s)ds → ∞   as |x| → ∞ ,
   s g(s) > 0   for s ≠ 0 ,   and g(0) = 0 ,

the Lyapunov function

   V(x) = x2² + 2 ∫(0 to x1) g(s)ds   (4.38)

suffices to prove that the state vector x_t approaches zero with probability one if -2a + σ² < 0.
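The computation behind this example is short enough to verify directly. The symbolic sketch below is ours (g is left as an unspecified function); applying the generator of (4.37) to the function (4.38) gives LV = (σ² - 2a)x2², which is non-positive exactly when -2a + σ² ≤ 0, so Kushner's theorem applies.

```python
import sympy as sp

x1, x2, a, sigma, s = sp.symbols('x1 x2 a sigma s', real=True)
g = sp.Function('g')

# System (4.37): dx1 = x2 dt,  dx2 = (-g(x1) - a*x2) dt - x2*sigma dB.
# Its generator (4.6) acts on a smooth V as
#   LV = x2*dV/dx1 + (-g(x1) - a*x2)*dV/dx2 + (sigma**2*x2**2/2)*d^2V/dx2^2 .
V = x2**2 + 2*sp.Integral(g(s), (s, 0, x1))      # the Lyapunov function (4.38)
LV = (x2*sp.diff(V, x1)
      + (-g(x1) - a*x2)*sp.diff(V, x2)
      + sp.Rational(1, 2)*sigma**2*x2**2*sp.diff(V, x2, 2))

print(sp.simplify(LV))   # (sigma**2 - 2*a)*x2**2, hence LV <= 0 when sigma**2 <= 2*a
```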

An especially readable account of the construction of stochastic Lyapunov functions is given in [64]. The collection of KUSHNER's work on the topic appears in his book [11].

V. CONCLUSION

We have attempted to present a survey of some major problems, results and trends in the subject of stability of stochastic systems. Although our survey and reference list are reasonably comprehensive, they by no means exhaust the many fine papers that have appeared in the journals on this topic. We can only hope that those readers who are interested in the topic will conduct their own literature survey and uncover the many fine ideas that have appeared in this fascinating subject.

REFERENCES
[1] A. A. ANDRONOW, L. S. PONTRYAGIN and A. A. WITT: On the statistical investigation of dynamical systems. J. Exp. Theor. Phys. 3, 165 (1933).
[2] M. C. WANG and G. E. UHLENBECK: On the theory of the Brownian motion II. Rev. Mod. Phys. 17, 323 (1945).
[3] L. CESARI: Asymptotic Behavior and Stability Problems in Ordinary Differential Equations. Springer, Berlin (1959).
[4] I. I. KATS and N. N. KRASOVSKII: On the stability of systems with random parameters. Prikl. Mat. Mek. 24, 809 (1960).
[5] B. H. BHARUCHA: On the stability of randomly varying systems. Ph.D. Thesis, Dept. of Elec. Eng., Univ. Calif., Berkeley, July (1961).
[6] F. KOZIN: Stability of stochastic systems. Paper 3A, 3rd IFAC Congress, London (1966).
[7] F. KOZIN: On relations between moment properties and almost sure Lyapunov stability for linear stochastic systems. J. Math. Anal. Appl. 10, 342 (1965).
[8] J. C. SAMUELS: On the mean square stability of random linear systems. Trans. IRE PGIT-5, Special Supplement, p. 248 (1959).
[9] W. M. WONHAM: Lyapunov criteria for weak stochastic stability. J. Diff. Eqs 5, 195 (1966).
[10] J. E. BERTRAM and P. E. SARACHIK: Stability of circuits with randomly time-varying parameters. Trans. IRE PGIT-5, Special Supplement, p. 260 (1959).
[11] H. KUSHNER: Stochastic Stability and Control. Academic Press, New York (1967).
[12] J. L. DOOB: Stochastic Processes. Wiley, New York (1953).
[13] K. ITO: On stochastic differential equations. Mem. Am. Math. Soc. No. 4 (1951).
[14] E. WONG and M. ZAKAI: On the relation between ordinary and stochastic differential equations. Int. J. Engng Sci. 3, 213 (1965).
[15] A. ROSENBLOOM: Analysis of linear systems with randomly time-varying parameters. Proc. Symp. Inf. Nets, Vol. III, Poly. Inst. Brooklyn, p. 145 (1954).
[16] A. ROSENBLOOM et al.: Analysis of linear systems with randomly varying inputs and parameters. IRE Convention Record, pt. 4, p. 106 (1955).
[17] A. T. BHARUCHA-REID: Elements of the Theory of Markov Processes and Their Applications. McGraw-Hill, New York (1960).
[18] A. R. BERGEN: Stability of systems with randomly time-varying parameters. IRE Trans. AC-5, 265 (1960).
[19] R. BELLMAN: Limit theorems for non-commutative operations. Duke Math. J. 21, 456 (1960).
[20] R. E. KALMAN: Control of randomly varying linear dynamical systems. Amer. Math. Soc., Proc. Symp. Appl. Math. 13, p. 287 (1962).
[21] J. T. PALMER: Sufficient conditions for almost sure Lyapunov stability for a class of linear systems. Ph.D. Thesis, School of Aero., Astro., Eng. Sci., Purdue Univ. (1966).
[22] T. MOROZAN: Stability of some linear stochastic systems. J. Diff. Eqs 3, 153 (1967).
[23] T. SOEDA and K. UMEDA: Stability of randomly time-varying control systems by the second method of Lyapunov. Bull. Fac. Engng, Tokushima Univ., Japan 3, 43 (1966).
[24] F. KOZIN: On almost sure stability of linear systems with random coefficients. MIT J. Math. Phys. 43, 59 (1963).
[25] C. B. MEHR and P. K. C. WANG: Discussion of reference [26]. ASME J. Appl. Mech. 33, 234 (1966).
[26] T. K. CAUGHEY and A. H. GRAY, JR.: On the almost sure stability of linear dynamic systems with stochastic coefficients. ASME J. Appl. Mech. 32, 365 (1965).
[27] S. T. ARIARATNAM: Dynamic stability of a column under random loading. Dynamic Stability of Structures, Proc. Int. Conf., p. 267. Pergamon, Oxford (1967).
[28] T. MOROZAN: Stability of linear systems with random parameters. J. Diff. Eqs 3, 170 (1967).
[29] P. K. C. WANG: On the almost sure stability of linear stochastic distributed parameter dynamical systems. ASME J. Appl. Mech. 33, 182 (1966).
[30] E. F. INFANTE: On the stability of some linear non-autonomous random systems. ASME paper 67-WA/APM-25 (1967).
[31] A. H. GRAY, JR.: Frequency dependent almost sure stability conditions for a parametrically excited random vibrational system. ASME paper 67-APM-9 (1967).
[32] J. L. BOGDANOFF: Influence on the behavior of a linear dynamical system of some imposed motion of small amplitude. J. Acoust. Soc. Am. 34, 1055 (1962).
[33] J. L. BOGDANOFF and S. J. CITRON: On the stabilization of the inverted pendulum. Proc. 9th Midwestern Mechanics Conf. (1965).
[34] K. ITO: Stationary random distributions. Mem. Coll. Sci. Univ. Kyoto, Japan 28, 209 (1953).
[35] A. M. YAGLOM: An Introduction to the Theory of Stationary Random Functions. Prentice-Hall, New Jersey (1962).
[36] K. ITO: Lectures on Stochastic Processes. Tata Institute, Bombay (1961).
[37] K. ITO: On a formula concerning stochastic differentials. Nagoya Math. J., Japan 3, 55 (1951).
[38] R. L. STRATONOVICH: A new representation for stochastic integrals and equations. SIAM J. Contr. 4, 362 (1966).
[39] K. J. ASTROM: On a first order stochastic differential equation. Int. J. Contr. 1, 301 (1965).
[40] A. H. GRAY, JR. and T. K. CAUGHEY: A controversy in problems involving random parametric excitation. MIT J. Math. Phys. 44, 288 (1965).
[41] J. M. C. CLARK: The representation of non-linear stochastic systems with applications to filtering. Ph.D. Thesis, Univ. London, Imperial College (1966).
[42] J. C. SAMUELS and A. C. ERINGEN: On stochastic linear systems. MIT J. Math. Phys. 38, 83 (1959).

[43] J. C. SAMUELS: On the stability of random systems and the stabilization of deterministic systems with random noise. J. Acoust. Soc. Am. 32, 594 (1960).
[44] T. K. CAUGHEY: Comments on "On the stability of random systems". J. Acoust. Soc. Am. 32, 1356 (1960).
[45] M. A. LEIBOWITZ: Statistical behavior of linear systems with randomly varying parameters. J. Math. Phys. 4, 852 (1963).
[46] J. L. BOGDANOFF and F. KOZIN: Moments of the output of linear random systems. J. Acoust. Soc. Am. 34, 1063 (1962).
[47] M. LOEVE: Probability Theory, 2nd Ed. Van Nostrand, Princeton (1963).
[48] M. B. NEVEL'SON and R. Z. KHAS'MINSKII: On the stability of stochastic systems (in Russian). Prob. Trans. Inf. 2, 76 (1966).
[49] M. B. NEVEL'SON and R. Z. KHAS'MINSKII: Stability of a linear system with random disturbances of its parameters (in English). Prikl. Mat. Mek. 30, 487 (1966).
[50] R. Z. KHAS'MINSKII: First approximation stability in the case of stochastic systems (in Russian). Prikl. Mat. Mek. 31, 1021 (1967).
[51] M. ZAKAI: On the ultimate boundedness of moments associated with solutions of stochastic differential equations. Technion Haifa, Faculty of Elec. Eng., Pub. No. 58 (1966).
[52] P. I. KUZNETSOV et al.: Non-Linear Transformations of Stochastic Processes. Pergamon Press, Oxford (1965).
[53] N. N. BOGOLIUBOV and Y. A. MITROPOLSKII: Asymptotic Methods in the Theory of Non-Linear Oscillations. Gordon & Breach, New York (1961).
[54] V. G. KOLOMIETS: The parametric effect of a random force on a non-linear oscillating system (in Russian). Ukr. Math. J. 15, 199 (1963).
[55] Y. A. MITROPOLSKII and V. G. KOLOMIETS: Application of the averaging principle to the investigation of the influence of random effects on oscillatory systems (in Russian). Mathematical Physics (Edited by Y. A. MITROPOLSKII), Kiev, p. 146 (1967).
[56] R. Z. KHAS'MINSKII: Principle of averaging for parabolic and elliptic differential equations and Markov processes with small diffusion. Th. Prob. Appls 8, 1 (1963).
[57] R. Z. KHAS'MINSKII: On stochastic processes defined by differential equations with a small parameter (in English). Th. Prob. Appls 11, 211 (1966).
[58] Y. SAWARAGI et al.: Statistical studies on the response of non-linear time-variant control systems subjected to a suddenly applied stationary Gaussian random input. Mem. Fac. Eng. Kyoto Univ., Japan 24, 465 (1962).
[59] R. C. BOOTON: Non-linear control systems with random inputs. Trans. IRE PGCT-1, p. 9 (1959).
[60] R. Z. KHAS'MINSKII: On the stability of the trajectories of Markov processes (in English). Prikl. Mat. Mek. 26, 1554 (1962).
[61] F. KOZIN: On almost sure asymptotic sample properties of diffusion processes defined by stochastic differential equations. J. Math. Kyoto Univ. 4, 575 (1965).
[62] R. Z. KHAS'MINSKII: Necessary and sufficient conditions for the asymptotic stability of linear stochastic systems. Th. Prob. Appls 1, 144 (1967).
[63] H. J. KUSHNER: On the stability of stochastic dynamical systems. Proc. Natl. Acad. Sci. 53, 8 (1967).
[64] H. J. KUSHNER: On the construction of stochastic Lyapunov functions. Trans. IEEE AC-10, 477 (1965).
[65] R. S. BUCY: Stability and positive supermartingales. J. Diff. Eqs 1, 151 (1965).
