Kinds and types of random processes. Random processes and their main statistical characteristics

In practice, one encounters random variables that change continuously in the course of a single experiment, depending on time or some other argument. For example, the radar tracking error does not remain constant but changes continuously with time. At each moment it is random, and its values at different times during the tracking of a single aircraft are different. Other examples are: the lead angle when continuously aiming at a moving target; the error of a radio range finder during continuous measurement of a varying range; the deviation of a guided projectile's trajectory from the theoretical one in the process of control or homing; fluctuation (shot and thermal) noise in radio devices, and so on. Such random variables are called random functions. A characteristic feature of these functions is that their form cannot be specified before the experiment. A random function relates to a random variable in the same way that a function of mathematical analysis relates to a constant.

Definition 1. A random function is a function that associates a numerical function with each outcome of the experiment; that is, it is a mapping of the space Ω into some set of functions (Figure 1).

Definition 2. A random function is a function that, as a result of the experiment, can take one specific form or another, and it is not known in advance which one.


The specific form that a random function takes as a result of the experiment is called a realization of the random function.

Because its behavior is unpredictable, a random function cannot be depicted in general form on a graph. One can only plot its specific forms, that is, the realizations obtained as a result of experiments. Random functions, like random variables, are usually denoted by capital letters of the Latin alphabet, X(t), Y(t), Z(t), and their possible realizations, respectively, by x(t), y(t), z(t). The argument t of a random function is, in the general case, an arbitrary (non-random) independent variable or a set of independent variables.

A random function is called a random process if its argument is time. If the argument of a random function is discrete, it is called a random sequence. For example, a sequence of random variables is a random function of an integer argument. In Figure 2, as an example, realizations x1(t), x2(t), …, xn(t) of a random function X(t) are shown; they are continuous functions of time. Such functions are used, for example, for the macroscopic description of fluctuation noise.

Random functions are encountered whenever we deal with a continuously operating system (a system of measurement, control, guidance, or regulation) and, when analyzing the accuracy of the system, have to account for random influences (fields). For example, the air temperature in different layers of the atmosphere is considered a random function of the altitude H; the position of the center of mass of a rocket (its vertical coordinate z in the firing plane) is a random function of its horizontal coordinate x. This position in each experiment (launch) with the same initial data is always somewhat different and differs from the theoretically calculated one.

Consider some random function X(t). Suppose that n independent experiments were performed, as a result of which n realizations x1(t), x2(t), …, xn(t) were obtained (Figure 3). Each realization is obviously a regular (non-random) function. Thus, as a result of each experiment, the random function X(t) turns into an ordinary, non-random function.

Let us fix some value of the argument, t = t0, and draw through it a straight line parallel to the ordinate axis (Figure 3). This line intersects the realizations at certain points.

Definition. The set of intersection points of the realizations of a random function with the straight line t = t0 is called a section of the random function.

Obviously, a section is a random variable whose possible values are the ordinates of the points of intersection of the line t = t0 with the realizations xi(t), i = 1, …, n.

Thus, a random function combines the features of a random variable and of a function. If the value of the argument is fixed, it turns into an ordinary random variable; as a result of each experiment, it turns into an ordinary (non-random) function.

For example, if we draw two sections t = t1 and t = t2, we obtain two random variables X(t1) and X(t2), which together form a system of two random variables.
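To make the section construction concrete, here is a minimal numerical sketch; the choice of process, the parameters, and all variable names are illustrative assumptions, not part of the lecture. An ensemble of realizations is generated, and the sections X(t1) and X(t2) are extracted as ordinary random variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random function: X(t) = A*sin(t) + noise, with A random per experiment.
n_realizations = 1000
t = np.linspace(0.0, 10.0, 201)                       # grid of argument values
A = rng.normal(1.0, 0.3, size=(n_realizations, 1))    # random amplitude, one per experiment
noise = rng.normal(0.0, 0.1, size=(n_realizations, t.size))
X = A * np.sin(t) + noise                             # each row is one realization x_i(t)

# A section at t = t0 is one column of the ensemble: an ordinary random variable.
i1, i2 = 40, 120                                      # grid indices playing the role of t1, t2
X_t1, X_t2 = X[:, i1], X[:, i2]                       # the system of random variables X(t1), X(t2)
print("section means:", X_t1.mean(), X_t2.mean())
print("section correlation:", np.corrcoef(X_t1, X_t2)[0, 1])
```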

2 Laws of distribution

A random function of a continuously varying argument is, on any arbitrarily small interval of its variation, equivalent to an infinite, uncountable set of random variables that cannot even be enumerated. Therefore, for a random function it is impossible to define a distribution law in the usual way, as for ordinary random variables and random vectors. To study random functions, an approach based on fixing one or more values of the argument t and studying the resulting random variables is used; that is, random functions are studied in separate sections corresponding to different values of the argument t.


Fixing one value t1 of the argument t, consider the random variable X1 = X(t1). For this random variable one can define the distribution law in the usual way: for example, the distribution function F1(x1, t1) and the probability density f1(x1, t1). These laws are called the one-dimensional distribution laws of the random function X(t). Their peculiarity is that they depend not only on the possible value x1 of the random function X(t) at t = t1, but also on how the value t1 of the argument t is chosen; that is, the distribution laws of the random variable X1 = X(t1) depend on the argument t1 as a parameter.

Definition. The function F1(x1, t1) = P(X(t1) < x1) is called the one-dimensional probability distribution function of the random function, or

F1(x, t) = P(X(t) < x). (1)

Definition. If the distribution function F1(x1, t1) = P(X(t1) < x1) is differentiable with respect to x1, then this derivative is called the one-dimensional probability distribution density (Figure 4), or

f1(x, t) = ∂F1(x, t)/∂x. (2)

The one-dimensional distribution density of a random function has the same properties as the distribution density of a random variable. In particular:

1) f1(x, t) ≥ 0;

2) ∫_{−∞}^{+∞} f1(x, t) dx = 1.

One-dimensional distribution laws do not describe a random function completely, since they do not take into account the dependence between the values of the random function at different moments of time.

Since for a fixed value of the argument t a random function turns into an ordinary random variable, fixing n values of the argument yields a set of n random variables X(t1), X(t2), …, X(tn), that is, a system of random variables. Specifying the one-dimensional distribution density f1(x, t) of the random function X(t) for an arbitrary value of the argument t is therefore analogous to specifying the densities of the individual variables of the system. A complete description of a system of random variables is their joint distribution law. Therefore, a more complete characterization of the random function X(t) is the n-dimensional distribution density of the system, that is, the function fn(x1, x2, …, xn, t1, t2, …, tn).

In practice, finding the n-dimensional distribution law of a random function causes, as a rule, great difficulties; therefore, one is usually limited to the two-dimensional distribution law, which characterizes the probabilistic relationship between pairs of values X(t1) and X(t2).

Definition. The two-dimensional distribution density of a random function X(t) is the joint distribution density of its values X(t1) and X(t2) for two arbitrary values t1 and t2 of the argument t:

f2(x1, x2, t1, t2) = ∂²F2(x1, x2, t1, t2) / (∂x1 ∂x2), (3)

where

F2(x1, x2, t1, t2) = P(X(t1) < x1, X(t2) < x2) (4)

is the two-dimensional distribution function, so that

F2(x1, x2, t1, t2) = ∫_{−∞}^{x1} ∫_{−∞}^{x2} f2(y1, y2, t1, t2) dy2 dy1. (5)

The normalization condition for the two-dimensional distribution density has the form

∫_{−∞}^{+∞} ∫_{−∞}^{+∞} f2(x1, x2, t1, t2) dx1 dx2 = 1. (6)
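A short sketch of how the one-dimensional law (1)-(2) and the one-dimensional analogue of the normalization condition can be checked empirically in a single section; the process chosen here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative section: X(t1) for an assumed process X(t) = t + standard normal noise.
t1 = 2.0
samples = t1 + rng.normal(size=100_000)   # sampled values of the section X(t1)

# Empirical distribution function F1(x, t1) = P(X(t1) < x) at a chosen point x.
x = 2.5
F1_hat = np.mean(samples < x)

# Empirical density f1(x, t1) via a normalized histogram; its total area must be 1,
# which is the one-dimensional normalization property checked numerically.
density, edges = np.histogram(samples, bins=100, density=True)
area = np.sum(density * np.diff(edges))
print(f"F1({x}, {t1}) ~ {F1_hat:.4f}; histogram area = {area:.4f}")
```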

3 Characteristics of a random process: mathematical expectation and variance

When solving practical problems, in most cases, obtaining and using multidimensional densities to describe a random function is associated with cumbersome mathematical transformations. In this regard, in the study of a random function, the simplest probabilistic characteristics are most often used, similar to the numerical characteristics of random variables (mathematical expectation, variance) and the rules of action with these characteristics are established.

In contrast to the numerical characteristics of random variables, which are constant numbers, the characteristics of a random function are non-random functions of its argument.

Consider a random function X(t) at a fixed t. In the section we have an ordinary random variable. Obviously, in the general case its mathematical expectation depends on t, that is, it is a function of t:

m_x(t) = M[X(t)]. (7)

Definition. The mathematical expectation of a random function X(t) is the non-random function m_x(t) whose value for each t equals the mathematical expectation of the corresponding section of the random function (Figure 5).

To calculate the mathematical expectation of a random function, it is enough to know its one-dimensional distribution density:

m_x(t) = ∫_{−∞}^{+∞} x f1(x, t) dx. (8)

The mathematical expectation is also called the non-random component of the random function X(t), while the difference

X̊(t) = X(t) − m_x(t) (9)

is called the fluctuation part of the random function, or the centered random function.

Definition. The variance of a random function X(t) is the non-random function D_x(t) whose value for each t is equal to the variance of the corresponding section of the random function.

It follows from the definition that

D_x(t) = D[X(t)] = M[(X(t) − m_x(t))²] = ∫_{−∞}^{+∞} (x − m_x(t))² f1(x, t) dx.

The variance of a random function characterizes, for each t, the spread of the possible realizations of the random function about the mean, in other words, the "degree of randomness" of the random function (Figure 6).
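The following sketch estimates m_x(t) and D_x(t) by ensemble averaging, for an assumed process whose exact moments are known, so the estimates can be verified; all names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed process with known moments: X(t) = 2t + sqrt(t + 1) * N(0, 1),
# so that m_x(t) = 2t and D_x(t) = t + 1 exactly.
n = 50_000
t = np.linspace(0.0, 5.0, 101)
X = 2.0 * t + np.sqrt(t + 1.0) * rng.normal(size=(n, t.size))

m_hat = X.mean(axis=0)   # ensemble estimate of m_x(t), one value per section
D_hat = X.var(axis=0)    # ensemble estimate of D_x(t), one value per section

print("max |m_hat - 2t|:", np.max(np.abs(m_hat - 2.0 * t)))        # close to 0
print("max |D_hat - (t+1)|:", np.max(np.abs(D_hat - (t + 1.0))))   # close to 0
```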

Lecture 18

The concept of a random process. Characteristics of random processes.

Stationary random processes.

Random processes with independent increments

Definition. A random process is a family of random variables X(t) = X(t, ω) defined on a probability space (Ω, F, P), where t ∈ T is the current time. The set T of parameter values is called the domain of definition of the random process, and the set S of possible values of X(t) is called the space of values of the random process.

A random process, unlike a deterministic process, cannot be predicted in advance. As examples of random processes, one can consider the Brownian motion of particles, the operation of telephone exchanges, interference in radio engineering systems, etc.

If the domain of definition T of a random process is a finite or countable set of time readings, then X(t) is called a random process with discrete time, or a random sequence (chain); if the domain of definition is a continuum, then X(t) is called a random process with continuous time.

If the space S of values of a random process is a finite or countable set, the random process is called discrete. If the space of values of a random process is a continuum, the random process is called continuous.

The real function x(t) = X(t, ω0), for some fixed outcome ω0, is called a realization, or trajectory, of the random process. Thus, a random process is the collection of all its possible realizations {x_k(t)}, where the realization index k may belong to a countable set of real numbers or to a continuum. A deterministic process has a single realization, described by a given function x(t).

For a fixed t = t0 we obtain an ordinary random variable X(t0), which is called the section of the random process at the time t0.

The univariate (one-dimensional) distribution function of a random process X(t) at a fixed t is the function

F1(x, t) = P(X(t) < x), x ∈ R.

This function specifies the probability of the set of trajectories that, for a fixed t, pass below the point (t, x).

For x1 < x2 it follows from the definition of the one-dimensional distribution function that the equality

P(x1 ≤ X(t) < x2) = F1(x2, t) − F1(x1, t)

specifies the probability of the set of trajectories passing through the "gate" between the points (t, x1) and (t, x2).

The bivariate (two-dimensional) distribution function of a random process X(t) at fixed t1 and t2 is the function

F2(x1, x2, t1, t2) = P(X(t1) < x1, X(t2) < x2), x1, x2 ∈ R.

This function specifies the probability of the set of trajectories that simultaneously pass below the points (t1, x1) and (t2, x2).

Similarly, the n-dimensional distribution function of the random process X(t) at fixed t1, …, tn is defined by the equality

Fn(x1, …, xn, t1, …, tn) = P(X(t1) < x1, …, X(tn) < xn)

for all x1, …, xn from R.

If this function is differentiable a sufficient number of times, then the n-dimensional joint probability density of the random process X(t) has the form

fn(x1, …, xn, t1, …, tn) = ∂ⁿ Fn(x1, …, xn, t1, …, tn) / (∂x1 … ∂xn).

The larger n is, the more fully the distribution function or probability density describes the random process; these functions take into account the dependence between arbitrary, but only finitely many, fixed sections of the process. A random process is considered specified if the set of all its n-dimensional distribution laws, or n-dimensional probability densities, is given for every n. In this case, the distribution functions must satisfy Kolmogorov's symmetry and consistency conditions. The symmetry condition states that Fn is a symmetric function with respect to all pairs (xi, ti), in the sense that, for example,

F2(x1, x2, t1, t2) = F2(x2, x1, t2, t1).

The consistency condition means that

Fm(x1, …, xm, t1, …, tm) = Fn(x1, …, xm, ∞, …, ∞, t1, …, tn), m < n,

that is, the n-dimensional distribution law of a random process determines all its distribution laws of lower dimension.

Let us consider various characteristics of random processes.

Definition. The mathematical expectation, or mean value, of a random process X(t) is the function

m(t) = M[X(t)] = ∫_{−∞}^{+∞} x f1(x, t) dx,

where f1(x, t) is the one-dimensional probability density of the random process. Geometrically, the mathematical expectation corresponds to a curve around which the trajectories of the random process are grouped.

Definition. The variance of a random process X(t) is the function

D(t) = M[(X(t) − m(t))²] = ∫_{−∞}^{+∞} (x − m(t))² f1(x, t) dx.

Thus, the mathematical expectation and the variance of a random process X(t) depend on the one-dimensional probability density and are non-random functions of time t. The variance of a random process characterizes the degree of scatter of the trajectories about the mean value m(t). The greater the variance, the greater the spread of the trajectories. If the variance is zero, then all trajectories of the random process coincide with the mathematical expectation m(t), and the process itself is deterministic.

Definition. The correlation function K(t1, t2) of a random process X(t) is defined by the equality

K(t1, t2) = M[(X(t1) − m(t1))(X(t2) − m(t2))] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} (x1 − m(t1))(x2 − m(t2)) f2(x1, x2, t1, t2) dx1 dx2,

where f2(x1, x2, t1, t2) is the two-dimensional probability density of the random process.

The correlation function K(t1, t2) characterizes the degree of dependence between the ordinates of the random process X(t) at two moments of time t1 and t2. The larger the correlation function, the smoother the trajectories of the random process, and vice versa.

The correlation function has the following properties.

1°. Symmetry: K(t1, t2) = K(t2, t1).

2°. |K(t1, t2)| ≤ √(D(t1) D(t2)).

These properties follow from the corresponding properties of the covariance of random variables.

The theory that studies random processes based on the mathematical expectation and the correlation function is called correlation theory. With the help of methods of correlation theory, mainly linear systems of automatic regulation and control are investigated.
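As an illustration of correlation theory in practice, here is a sketch that estimates the correlation function K(t1, t2) from an ensemble. The process X(t) = A cos t + B sin t with independent standard normal A, B is an assumed example for which K(t1, t2) = cos(t1 − t2) is known exactly, so the estimate can be verified.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed process: X(t) = A*cos(t) + B*sin(t) with independent A, B ~ N(0, 1).
# For it m(t) = 0 and K(t1, t2) = cos(t1 - t2).
n = 100_000
t = np.linspace(0.0, 2.0 * np.pi, 64)
A = rng.normal(size=(n, 1))
B = rng.normal(size=(n, 1))
X = A * np.cos(t) + B * np.sin(t)

Xc = X - X.mean(axis=0)                    # centered sections
K_hat = (Xc.T @ Xc) / (n - 1)              # estimate of K(t_i, t_j) for all grid pairs
K_exact = np.cos(t[:, None] - t[None, :])
print("max error:", np.max(np.abs(K_hat - K_exact)))   # on the order of 1e-2 at this n
```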

Definition. A random process X(t), t ∈ T, is called stationary in the narrow sense if the joint distribution of the random variables

X(t1 + τ), X(t2 + τ), …, X(tn + τ) and X(t1), X(t2), …, X(tn)

is the same and does not depend on τ, that is,

Fn(x1, …, xn, t1 + τ, …, tn + τ) = Fn(x1, …, xn, t1, …, tn).

Hence, for the n-dimensional probability density, the relation

fn(x1, …, xn, t1 + τ, …, tn + τ) = fn(x1, …, xn, t1, …, tn)

holds. Taking this relation with n = 1 and setting τ = −t1, we have f1(x, t1) = f1(x, 0) = f1(x). Hence, for a stationary random process, we find the following expression for the mathematical expectation:

m(t) = ∫_{−∞}^{+∞} x f1(x) dx = m = const.

Similarly, for the two-dimensional probability density, setting τ = −t1 in the equality for n = 2, we get f2(x1, x2, t1, t2) = f2(x1, x2, 0, t2 − t1). Therefore, the correlation function can be written as

K(t1, t2) = K(τ), where τ = t2 − t1.

Thus, for random processes stationary in the narrow sense, the mathematical expectation is a constant, and the correlation function depends only on the difference of the arguments; moreover, K(τ) = K(−τ), since the correlation function is symmetric.

Definition. A random process with a constant mathematical expectation and a correlation function that depends only on the difference of the arguments is called a random process stationary in the broad sense. Clearly, a random process that is stationary in the narrow sense is also stationary in the broad sense. The converse assertion is, in general, not true.

The correlation function of a stationary random process has the following properties.

1°. K(τ) = K(−τ), that is, the function K(τ) is even.

2°. The inequality |K(τ)| ≤ K(0) holds.

3°. For the variance of a stationary random process, the relation D = K(0) holds.

Let X(t), t ∈ T, be a stationary random process, continuous in time t, with mathematical expectation m and correlation function K(τ).

Definition. The function denoted S(ω) and determined by the relation

S(ω) = (1/2π) ∫_{−∞}^{+∞} K(τ) e^{−iωτ} dτ

is called the spectral density.

If the spectral density S(ω) is known, then using the Fourier transform one can find the correlation function:

K(τ) = ∫_{−∞}^{+∞} S(ω) e^{iωτ} dω.

The last two equalities are called the Wiener-Khinchin formulas.

It is obvious that for the existence of the inverse Fourier transform it is sufficient that the integral ∫_{−∞}^{+∞} |K(τ)| dτ exist, that is, that the correlation function K(τ) be absolutely integrable on the interval (−∞, +∞).

It can be shown that the spectral density S(ω) of a stationary random process is an even function, that is, S(−ω) = S(ω).

Since K(τ) is an even function,

S(ω) = (1/π) ∫_0^{∞} K(τ) cos ωτ dτ,

K(τ) = 2 ∫_0^{∞} S(ω) cos ωτ dω.

From these formulas and the definition of the correlation function K(τ) it follows that the variance of the stationary random process X(t) is equal to

D = K(0) = ∫_{−∞}^{+∞} S(ω) dω.

If a random process is a fluctuation of an electric current or voltage, then the variance of the random process, as the mean square of the current or voltage, is proportional to the average power of this process. Therefore, it follows from the last equality that the spectral density S(ω) in this case characterizes the power density per unit of circular frequency ω.

In practice, instead of the spectral density S(ω), the normalized spectral density

s(ω) = S(ω)/D

is often used, where D is the variance of the process.

Then, as is easy to see, the so-called normalized correlation function k(τ) = K(τ)/D and the normalized spectral density s(ω) are related by the direct and inverse Fourier transforms:

s(ω) = (1/2π) ∫_{−∞}^{+∞} k(τ) e^{−iωτ} dτ, k(τ) = ∫_{−∞}^{+∞} s(ω) e^{iωτ} dω.

Setting τ = 0 and taking into account that k(0) = 1, we have

∫_{−∞}^{+∞} s(ω) dω = 1.

Taking into account the evenness of the spectral function, we obtain

2 ∫_0^{∞} s(ω) dω = 1,

that is, the total area bounded below by the ω axis and above by the graph of the normalized spectral density is equal to one.
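A numerical sketch of the Wiener-Khinchin relations under the convention adopted above; the exponential correlation function K(τ) = D e^{−α|τ|} and all constants are illustrative assumptions.

```python
import numpy as np

# Assumed exponential correlation function K(tau) = D * exp(-alpha*|tau|).
# Under the convention S(w) = (1/2pi) * integral of K(tau)*exp(-i*w*tau) dtau,
# its spectral density is S(w) = D*alpha / (pi*(alpha**2 + w**2)).
D, alpha = 2.0, 1.5

def S(w):
    return D * alpha / (np.pi * (alpha**2 + w**2))

# Numerical check of D = K(0) = integral of S(w) dw (Riemann sum on a wide grid).
w = np.linspace(-2000.0, 2000.0, 400_001)
dw = w[1] - w[0]
print("variance from spectrum:", np.sum(S(w)) * dw, "expected:", D)

# The normalized spectral density s(w) = S(w)/D integrates to one.
print("area under s(w):", np.sum(S(w) / D) * dw)
```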

Definition. A random process X(t), t ∈ T, is called a process with independent increments if for any n and any t0 < t1 < … < tn from T, the random variables

X(t0), X(t1) − X(t0), …, X(tn) − X(tn−1)

are independent.

In this case, the correlation between any two different increments is equal to zero.

If the increments are only pairwise uncorrelated, then the random process X(t) is called a process with uncorrelated, or orthogonal, increments.

Since independent random variables are uncorrelated (orthogonal), any process with independent increments is a process with orthogonal increments.

Let X(t) be a centered random process with orthogonal increments and X(t0) = 0. Then for t1 < t2 we get

K(t1, t2) = M[X(t1) X(t2)] = M[X(t1)(X(t2) − X(t1))] + M[X²(t1)] = D(t1),

because the random variables X(t1) − X(t0) and X(t2) − X(t1) are orthogonal. Similarly, for t2 < t1 we get K(t1, t2) = D(t2).

So the correlation function of a random process with orthogonal increments has the property

K(t1, t2) = D(min(t1, t2)).

Applying the Heaviside function η(t), the correlation function can be written as

K(t1, t2) = D(t1) η(t2 − t1) + D(t2) η(t1 − t2), t1 ≠ t2.
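A sketch illustrating the property K(t1, t2) = D(min(t1, t2)) on a discrete-time analogue of the Wiener process (cumulative sums of independent steps); all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Discrete-time analogue of the Wiener process: cumulative sums of i.i.d. N(0, 1)
# steps. It has independent (hence orthogonal) increments, zero mean and D(t) = t,
# so K(t1, t2) should be close to min(t1, t2).
n_paths, n_steps = 100_000, 50
steps = rng.normal(size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)               # X(t) for t = 1, 2, ..., n_steps

t1, t2 = 10, 30                            # two time points
K_hat = np.mean(X[:, t1 - 1] * X[:, t2 - 1])   # the process is already centered
print("K_hat:", K_hat, "expected min(t1, t2):", min(t1, t2))
```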
Literature: [L.1], pp. 155-161

[L.2], pp. 406-416, 42-426

[L.3], pp. 80-81

Mathematical models of random signals and noise are random processes. A random process (SP) is the variation of a random variable in time. Random processes include most of the processes occurring in radio engineering devices, as well as the interference that accompanies the transmission of signals over communication channels. Random processes can be continuous (NSP) or discrete (DSP), depending on whether the random variable changing in time is continuous or discrete. In what follows, the main attention will be given to the NSP.

Before proceeding to the study of random processes, it is necessary to decide how to represent them. We will denote a random process by X(t) and its specific realization by x(t). A random process can be represented either by a set (ensemble) of realizations or by a single, sufficiently extended-in-time realization. If we photograph several oscillograms of a random process and place the photographs one under another, the collection of these photographs will represent an ensemble of realizations (Fig. 5.3).

Here x1(t), x2(t), …, xk(t) are the first, second, …, k-th realizations of the process. If, instead, the variation of the random variable is recorded on a chart-recorder tape over a sufficiently long time interval T, the process will be represented by a single realization (Fig. 5.3).

Like random variables, random processes are described by distribution laws and by probabilistic (numerical) characteristics. The probabilistic characteristics can be obtained either by averaging the values of the random process over the ensemble of realizations or by averaging over a single realization.

Let the random process be represented by an ensemble of realizations (Fig. 5.3). If we choose an arbitrary moment of time t1 and fix the values taken by the realizations at this moment, the collection of these values forms a one-dimensional section of the SP,

X(t1) = {x1(t1), x2(t1), …, xk(t1)},

and is a random variable. As already emphasized above, an exhaustive characteristic of a random variable is its distribution function F(x, t1) or its one-dimensional probability density f(x, t1).

Naturally, both F(x, t1) and f(x, t1) have all the properties of the distribution function and of the probability density discussed above.

The numerical characteristics in the section are determined in accordance with expressions (5.20), (5.22), (5.24) and (5.26). In particular, the mathematical expectation of the SP in the section is determined by the expression

m(t1) = ∫_{−∞}^{+∞} x f(x, t1) dx,

and the variance by the expression

D(t1) = ∫_{−∞}^{+∞} (x − m(t1))² f(x, t1) dx.
However, the distribution laws and numerical characteristics in a single section are not enough to describe a random process that develops in time. Therefore it is necessary to consider a second section (Fig. 5.3). In this case the SP is described by two random variables X(t1) and X(t2), spaced apart by a time interval τ, and is characterized by a two-dimensional distribution function F(x1, x2; t1, t2) and a two-dimensional density f(x1, x2; t1, t2), where t2 = t1 + τ. Obviously, by introducing a third, fourth, and so on section, one arrives at a multidimensional (N-dimensional) distribution function and, accordingly, a multidimensional distribution density.

The most important characteristic of a random process is the autocorrelation function (ACF)

R(t1, t2) = M[X(t1) X(t2)],

which establishes the degree of statistical relationship between the values of the SP at the moments of time t1 and t2.

Representing the SP as an ensemble of realizations leads to the concept of stationarity of the process. A random process is stationary if all its initial and central moments do not depend on time, that is,

m(t) = m = const, D(t) = D = const.

These are strict conditions, so when they are met the SP is considered stationary in the narrow sense.

In practice, the concept of stationarity in the broad sense is used. A random process is stationary in the broad sense if its mathematical expectation and variance do not depend on time, that is,

m(t) = m = const, D(t) = D = const,

and the autocorrelation function is determined only by the interval τ = t2 − t1 and does not depend on the choice of t1 on the time axis:

R(t1, t2) = R(t2 − t1) = R(τ).

In what follows, only random processes that are stationary in the broad sense will be considered.

It was noted above that a random process, besides being represented by an ensemble of realizations, can be represented by a single realization on a time interval T. Obviously, all characteristics of the process can then be obtained by averaging the values of the process over time.

The mathematical expectation of the SP when averaged over time is determined as follows:

m = lim_{T→∞} (1/T) ∫_0^T x(t) dt. (5.46)

This implies its physical meaning: the mathematical expectation is the mean value (constant component) of the process.

The SP variance is determined by the expression

D = lim_{T→∞} (1/T) ∫_0^T (x(t) − m)² dt (5.47)

and has the physical meaning of the average power of the variable component of the process.

The autocorrelation function when averaged over time is

R(τ) = lim_{T→∞} (1/T) ∫_0^T x(t) x(t + τ) dt. (5.48)
A random process is called ergodic if its probabilistic characteristics obtained by averaging over the ensemble coincide with the probabilistic characteristics obtained by time-averaging a single realization from this ensemble. Ergodic processes are stationary.

The use of expressions (5.46), (5.47) and (5.48) requires, strictly speaking, a realization of the random process of large (theoretically infinite) extent. When solving practical problems, the time interval is limited. In this case, most processes are considered approximately ergodic, and their probabilistic characteristics are determined in accordance with the expressions

m = (1/T) ∫_0^T x(t) dt; (5.49)

D = (1/T) ∫_0^T (x(t) − m)² dt;

R(τ) = (1/(T − τ)) ∫_0^{T−τ} x(t) x(t + τ) dt.

Random processes whose mathematical expectation is zero are called centered. In what follows, x(t) and y(t) will denote values of centered random processes. Then the expressions for the variance and the autocorrelation function take the form

D = (1/T) ∫_0^T x²(t) dt; (5.50)

R(τ) = (1/(T − τ)) ∫_0^{T−τ} x(t) x(t + τ) dt.
We note the properties of the ACF of ergodic random processes:

– the autocorrelation function is a real function of the argument τ;

– the autocorrelation function is an even function, that is, R(τ) = R(−τ);

– with increasing τ the ACF decreases (not necessarily monotonically) and tends to zero as τ → ∞;

– the value of the ACF at τ = 0 is equal to the variance (average power) of the process:

R(0) = D.
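A sketch of time averaging per (5.49)-(5.50) on a single long realization of an assumed first-order autoregressive process, whose theoretical variance and ACF are known, so the time averages can be checked; the process and its parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Single long realization of an assumed ergodic process: the first-order
# autoregression x[k] = a*x[k-1] + e[k], for which D = 1/(1 - a**2) and the
# discrete-lag ACF is R(m) = D * a**|m|.
a, N = 0.9, 1_000_000
e = rng.normal(size=N)
x = np.empty(N)
x[0] = e[0]
for k in range(1, N):
    x[k] = a * x[k - 1] + e[k]

m_hat = x.mean()                       # time-average mean, cf. (5.49)
xc = x - m_hat                         # centered realization
D_hat = np.mean(xc**2)                 # time-average variance, cf. (5.50)

lag = 10
R_hat = np.mean(xc[:-lag] * xc[lag:])  # time-average ACF at this lag
print("D:", D_hat, "expected:", 1.0 / (1.0 - a**2))
print("R(10):", R_hat, "expected:", a**lag / (1.0 - a**2))
```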

In practice, one often has to deal with two or more SPs. For example, a mixture of a random signal and interference arrives simultaneously at the input of a radio receiver. The relationship between two random processes is established by the cross-correlation function (CCF). If X(t) and Y(t) are two random processes characterized by realizations x(t) and y(t), then the cross-correlation function is determined by the expression

R_xy(τ) = (1/(T − τ)) ∫_0^{T−τ} x(t) y(t + τ) dt.
The application of the general definitions given in the previous paragraph is illustrated below with several characteristic random processes.

Along with the designation of a random process by the symbol X(t), the lowercase designation x(t) will be used in the same sense, meaning a random function of time; as before, x(t) also denotes a realization of the random function.

1. HARMONIC OSCILLATION WITH RANDOM AMPLITUDE

Let, in the expression defining the signal

X(t) = A cos(ω0 t + θ0),

the frequency ω0 and the initial phase θ0 be deterministic, constant quantities, and the amplitude A be random, equiprobable in the range from 0 to Am (Fig. 4.2), so that the probability density of the amplitude is p(A) = 1/Am for 0 ≤ A ≤ Am.

Let us find the one-dimensional probability density p(x; t) for a fixed moment of time t. The instantaneous value x can be anything in the range from 0 to Am cos(ω0 t + θ0); we assume that cos(ω0 t + θ0) > 0. Hence,

p(x; t) = 1/(Am cos(ω0 t + θ0)), 0 ≤ x ≤ Am cos(ω0 t + θ0).

Fig. 4.2. The set of harmonic oscillations with a random amplitude

Fig. 4.3. Probability density of a harmonic oscillation with a random amplitude

The graph of the function p(x; t) for a fixed value of t is shown in Fig. 4.3.

The mathematical expectation is

m(t) = M[A] cos(ω0 t + θ0) = (Am/2) cos(ω0 t + θ0),

and, finally, the variance is

D(t) = D[A] cos²(ω0 t + θ0) = (Am²/12) cos²(ω0 t + θ0).

The random process under consideration is non-stationary and non-ergodic.
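A Monte Carlo sketch confirming the non-stationarity: both the ensemble mean and the ensemble variance of the random-amplitude oscillation vary with t. The constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Harmonic oscillation with random amplitude, A uniform on [0, Am]; Am, w0, theta0
# are illustrative constants. Ensemble averages over many realizations show that
# both m(t) and D(t) vary with t, i.e. the process is non-stationary.
Am, w0, theta0 = 2.0, 1.0, 0.0
n = 200_000
t = np.linspace(0.0, 2.0 * np.pi, 9)
A = rng.uniform(0.0, Am, size=(n, 1))
X = A * np.cos(w0 * t + theta0)

print("mean(t):", X.mean(axis=0))   # ~ (Am/2)   * cos(w0*t + theta0)
print("var(t): ", X.var(axis=0))    # ~ (Am^2/12)* cos(w0*t + theta0)**2, depends on t
```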

2. HARMONIC OSCILLATION WITH A RANDOM PHASE

Let the amplitude A0 and the frequency ω0 of a harmonic signal be reliably known in advance, and the initial phase θ be a random variable that can take any value in the range from −π to π with equal probability. This means that the probability density of the initial phase is

p(θ) = 1/(2π), −π ≤ θ ≤ π. (4.22)

Fig. 4.4. The set of harmonic oscillations with random phases

One of the realizations of the random process formed by the set of harmonic oscillations with random phases (Fig. 4.4) can be defined by the expression

x(t) = A0 cos(ω0 t + θ). (4.23)

The total phase of the oscillation, ψ = ω0 t + θ, is a random variable equiprobable in the range from −π to π. Hence, the probability density of the total phase is p(ψ) = 1/(2π).

Fig. 4.5. On the determination of the probability density of a harmonic oscillation with a random phase

Fig. 4.6. Probability density of a harmonic oscillation with a random phase

Let us find the one-dimensional probability density of the random process X(t). Select an interval (x, x + dx) (Fig. 4.5) and determine the probability that a measurement carried out in the time interval from t to t + dt finds the instantaneous value of the signal in the interval (x, x + dx). This probability can be written as p(x) dx, where p(x) is the desired probability density. It is obvious that this probability coincides with the probability that the random phase of the oscillation falls into one of the two phase intervals hatched in Fig. 4.5. The latter probability is 2 dψ/(2π) = dψ/π. Therefore,

p(x) dx = dψ/π,

whence the desired function is

p(x) = (1/π) (dψ/dx).

Since x = A0 cos ψ, we have |dψ/dx| = 1/√(A0² − x²); thus, finally,

p(x) = 1/(π √(A0² − x²)), |x| < A0, and p(x) = 0 for |x| > A0.

The graph of this function is shown in Fig. 4.6.

It is essential that the one-dimensional probability density does not depend on the choice of the time t, and that the average over the set (see (2.271.7) in )

M[X(t)] = ∫_{−A0}^{A0} x p(x) dx = 0

coincides with the time average

lim_{T→∞} (1/T) ∫_0^T x(t) dt = 0.

(This is true for any realization of the random process under consideration.)

The correlation function in this case can be obtained by averaging the product X(t1) X(t2) over the set, without resorting to the two-dimensional probability density [see the general expression (4.8)]. Substituting into (4.8)

X(t1) X(t2) = A0² cos(ω0 t1 + θ) cos(ω0 t2 + θ) = (A0²/2) [cos ω0(t2 − t1) + cos(ω0(t1 + t2) + 2θ)],

and taking into account that the first term is a deterministic quantity, while the second term, statistically averaged with the one-dimensional probability density [see (4.22)], vanishes, we obtain

R(t1, t2) = (A0²/2) cos ω0(t2 − t1) = (A0²/2) cos ω0τ.

The same result is obtained by averaging the product over time for any realization of the process.

The independence of the mean value and of the correlation function from the position of the interval t2 − t1 on the time axis allows us to consider the process under study as stationary. The coincidence of the results of averaging over the set and over time (for any realization) indicates the ergodicity of the process. Similarly, it is easy to show that a harmonic oscillation with a random amplitude and a random phase forms a stationary but non-ergodic process (different realizations have different variances).
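A sketch comparing ensemble and time averaging of the product X(t1) X(t1 + τ) for the random-phase oscillation; their agreement with (A0²/2) cos ω0τ illustrates the ergodicity claimed above. The constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Harmonic oscillation with phase uniform on [-pi, pi]; A0, w0, tau are illustrative.
A0, w0, tau = 1.0, 2.0, 0.4

# Ensemble average of X(t1) * X(t1 + tau) at an arbitrary t1.
theta = rng.uniform(-np.pi, np.pi, size=1_000_000)
t1 = 0.7
R_ens = np.mean(A0 * np.cos(w0 * t1 + theta) * A0 * np.cos(w0 * (t1 + tau) + theta))

# Time average over a single long realization (one fixed phase).
t = np.linspace(0.0, 4000.0, 4_000_001)
x = A0 * np.cos(w0 * t + 1.2345)
n_tau = int(round(tau / (t[1] - t[0])))
R_time = np.mean(x[:-n_tau] * x[n_tau:])

print(R_ens, R_time, (A0**2 / 2.0) * np.cos(w0 * tau))   # all three should agree
```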

3. GAUSSIAN RANDOM PROCESS

The normal (Gaussian) law of distribution of random variables is more common in nature than others. The normal process is especially typical of interference in communication channels. It is very convenient for analysis. Therefore, random processes whose distribution does not differ too much from the normal one are often replaced by a Gaussian process. The one-dimensional probability density of a normal process is given by

p(x) = (1/(√(2π) σ)) exp(−(x − m)²/(2σ²)). (4.28)
In this section we consider a stationary and ergodic Gaussian process. Therefore m and σ² can be understood, respectively, as the constant component and the average power of the fluctuation component of one (sufficiently long) realization of the random process.

Graphs of the probability density under the normal law for several values of σ are shown in Fig. 4.7. The function is symmetric with respect to the mean value m. The greater σ, the smaller the maximum, and the flatter the curve becomes [the area under the curve is equal to one for any value of σ].

The wide occurrence of the normal distribution law in nature is explained by the fact that when a sufficiently large number of independent or weakly dependent random variables are summed, the distribution of the sum is close to normal for any distribution of the individual terms.

This proposition, formulated in 1901 by A. M. Lyapunov, is called the central limit theorem.

Clear physical examples of a random process with a normal distribution law are noise caused by the thermal motion of free electrons in the conductors of an electrical circuit, as well as the shot effect in electronic devices (see § 7.3).

Fig. 4.7. One-dimensional probability density of a normal distribution

Fig. 4.8. Random functions with the same distribution (normal) but different frequency spectra

Not only noise and interference but also useful signals that are the sum of a large number of independent elementary signals, for example harmonic oscillations with random phase or amplitude, can often be interpreted as Gaussian random processes.

On the basis of the function p(x), one can find the relative time the signal stays within a certain level interval, the ratio of the maximum values to the root mean square value (the peak factor), and a number of other parameters of a random signal that are important in practice. Let us explain this with one of the realizations of the Gaussian process shown in Fig. 4.8 for m = 0. This function of time corresponds to noise interference whose energy spectrum extends from zero frequency up to some cutoff frequency. The probability that the value x(t) stays in the interval from a to b is determined by expression (4.1). Substituting (4.28) into this expression, for m = 0 we obtain

P(a < x < b) = (1/(√(2π) σ)) ∫_a^b exp(−x²/(2σ²)) dx.


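A sketch of the computation that expression (4.1) with density (4.28) leads to, expressed through the standard normal distribution function; σ, a, and b are illustrative values.

```python
import math

# P(a < x < b) for a zero-mean Gaussian section expressed through the standard
# normal distribution function Phi; sigma, a, b are illustrative values.
def phi(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

sigma, a, b = 1.0, -1.0, 1.0
p = phi(b / sigma) - phi(a / sigma)
print(f"P({a} < x < {b}) = {p:.4f}")   # ~ 0.6827 for one sigma on either side
```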