# Method of Lyapunov Functions

### Definition of the Lyapunov Function

A Lyapunov function is a scalar function defined on the phase space that can be used to prove the stability of an equilibrium point. The method of Lyapunov functions is applied to study the stability of various differential equations and systems. Below, we restrict ourselves to autonomous systems of the form

$\mathbf{X}' = \mathbf{f}\left( \mathbf{X} \right)\;\;\text{or}\;\;\frac{dx_i}{dt} = f_i\left( x_1, x_2, \ldots, x_n \right),\;\; i = 1, 2, \ldots, n,$

with the zero equilibrium $$\mathbf{X} \equiv \mathbf{0}.$$

We suppose that we are given a continuously differentiable function

$V\left( \mathbf{X} \right) = V\left( {{x_1},{x_2}, \ldots ,{x_n}} \right)$

in a neighborhood $$U$$ of the origin. Let $$V\left( \mathbf{X} \right) \gt 0$$ for all $$\mathbf{X} \in U\backslash \left\{ \mathbf{0} \right\}$$ and $$V\left( \mathbf{0} \right) = 0$$ at the origin. Examples are functions of the form

$V\left( x_1, x_2 \right) = ax_1^2 + bx_2^2,\;\; V\left( x_1, x_2 \right) = ax_1^2 + bx_2^4,\;\; a, b \gt 0.$

We find the total derivative of the function $$V\left( \mathbf{X} \right)$$ with respect to time $$t:$$

$\frac{dV}{dt} = \frac{\partial V}{\partial x_1}\frac{dx_1}{dt} + \frac{\partial V}{\partial x_2}\frac{dx_2}{dt} + \cdots + \frac{\partial V}{\partial x_n}\frac{dx_n}{dt}.$

This expression can be written as a scalar (dot) product of two vectors:

$\frac{dV}{dt} = \left( \text{grad}\,V, \frac{d\mathbf{X}}{dt} \right),\;\; \text{where}\;\; \text{grad}\,V = \left( \frac{\partial V}{\partial x_1}, \frac{\partial V}{\partial x_2}, \ldots, \frac{\partial V}{\partial x_n} \right),\;\; \frac{d\mathbf{X}}{dt} = \left( \frac{dx_1}{dt}, \frac{dx_2}{dt}, \ldots, \frac{dx_n}{dt} \right).$

Here, the first vector is the gradient of $$V\left( \mathbf{X} \right),$$ which always points in the direction of greatest increase of $$V\left( \mathbf{X} \right).$$ Typically, the function $$V\left( \mathbf{X} \right)$$ increases with the distance from the origin, i.e. as $$\left| \mathbf{X} \right|$$ grows. The second vector in the scalar product is the velocity vector. At any point, it is tangent to the phase trajectory.
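This dot-product computation is easy to carry out symbolically. Below is a minimal sketch using `sympy`; the system $$x' = -x,$$ $$y' = -y$$ and the candidate function $$V = x^2 + y^2$$ are illustrative choices, not taken from the text above.

```python
import sympy as sp

# Symbolic check of dV/dt = (grad V, dX/dt) for an illustrative system:
#   x' = -x,  y' = -y,  with candidate V(x, y) = x^2 + y^2.
x, y = sp.symbols('x y', real=True)
V = x**2 + y**2
f = sp.Matrix([-x, -y])  # right-hand side f(X)

# Gradient of V dotted with the velocity vector f(X)
grad_V = sp.Matrix([sp.diff(V, x), sp.diff(V, y)])
dV_dt = sp.simplify(grad_V.dot(f))
print(dV_dt)  # -2*x**2 - 2*y**2, negative everywhere except the origin
```

Since $$\frac{dV}{dt} = -2\left( x^2 + y^2 \right) \lt 0$$ away from the origin, this sample system fits the stable case discussed next.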

Consider the case when the derivative of $$V\left( \mathbf{X} \right)$$ in a neighborhood $$U$$ of the origin is negative:

$\frac{dV}{dt} = \left( \text{grad}\,V, \frac{d\mathbf{X}}{dt} \right) \lt 0.$

This means that the angle $$\varphi$$ between the gradient vector and the velocity vector is greater than $$90^\circ.$$ For a function of two variables, this is shown schematically in Figures $$1-2.$$

If the derivative $$\large\frac{{dV}}{{dt}}\normalsize$$ along a phase trajectory is everywhere negative, then the trajectory tends to the origin, i.e. the system is stable. Conversely, when the derivative $$\large\frac{{dV}}{{dt}}\normalsize$$ is positive, the trajectory moves away from the origin, i.e. the system is unstable.

We now turn to the strict formulation.

Let a function $$V\left( \mathbf{X} \right)$$ be continuously differentiable in a neighborhood $$U$$ of the origin. The function $$V\left( \mathbf{X} \right)$$ is called the Lyapunov function for an autonomous system

$\mathbf{X}' = \mathbf{f}\left( \mathbf{X} \right),$

if the following conditions are met:

1. $$V\left( \mathbf{X} \right) \gt 0$$ for all $$\mathbf{X} \in U\backslash \left\{ \mathbf{0} \right\}$$;
2. $$V\left( \mathbf{0} \right) = 0$$;
3. $${\large\frac{{dV}}{{dt}}\normalsize} \le 0$$ for all $$\mathbf{X} \in U$$.
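The three conditions can be checked symbolically for a concrete system. As an illustration (the system and candidate function below are hypothetical choices, not from the text), take the center-type system $$x' = -y,$$ $$y' = x$$ with $$V = x^2 + y^2$$:

```python
import sympy as sp

# Verify the Lyapunov-function conditions for the illustrative center
#   x' = -y,  y' = x  with candidate V(x, y) = x^2 + y^2.
x, y = sp.symbols('x y', real=True)
V = x**2 + y**2
dV_dt = sp.simplify(sp.diff(V, x) * (-y) + sp.diff(V, y) * x)

print(V.subs({x: 0, y: 0}))  # 0 -- condition 2 holds
print(dV_dt)                 # 0, so dV/dt <= 0 everywhere -- condition 3 holds
# Condition 1 holds since x^2 + y^2 > 0 for all (x, y) != (0, 0).
```

Here $$\frac{dV}{dt} \equiv 0,$$ so $$V$$ is a Lyapunov function and the theorem below gives Lyapunov stability (but not asymptotic stability, as the trajectories are closed orbits around the center).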

### Theorem on stability in the sense of Lyapunov

If in a neighborhood $$U$$ of the zero solution $$\mathbf{X} = \mathbf{0}$$ of an autonomous system there is a Lyapunov function $$V\left( \mathbf{X} \right),$$ then the equilibrium point $$\mathbf{X} = \mathbf{0}$$ of the system is Lyapunov stable.

### Theorem on asymptotic stability

If in a neighborhood $$U$$ of the zero solution $$\mathbf{X} = \mathbf{0}$$ of an autonomous system there is a Lyapunov function $$V\left( \mathbf{X} \right)$$ with a negative definite derivative $${\large\frac{{dV}}{{dt}}\normalsize} \lt 0$$ for all $$\mathbf{X} \in U\backslash \left\{ \mathbf{0} \right\},$$ then the equilibrium point $$\mathbf{X} = \mathbf{0}$$ of the system is asymptotically stable.

As can be seen, the total derivative $${\large\frac{{dV}}{{dt}}\normalsize}$$ must be strictly negative (negative definite) in a neighborhood of the origin for the asymptotic stability of the zero solution.
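A numerical illustration (not a proof) of this theorem: for the hypothetical system $$x' = -x - y,$$ $$y' = x - y,$$ the candidate $$V = x^2 + y^2$$ has $$\frac{dV}{dt} = -2\left( x^2 + y^2 \right) \lt 0$$ away from the origin, so $$V$$ should decrease monotonically along every trajectory.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative system x' = -x - y, y' = x - y (eigenvalues -1 +/- i),
# with candidate V = x^2 + y^2 and dV/dt = -2(x^2 + y^2) < 0.
def f(t, X):
    x, y = X
    return [-x - y, x - y]

sol = solve_ivp(f, (0.0, 5.0), [1.0, 1.0], dense_output=True)

# Sample V along the computed trajectory at evenly spaced times.
ts = np.linspace(0.0, 5.0, 11)
xs, ys = sol.sol(ts)
V = xs**2 + ys**2
print(V)  # strictly decreasing toward 0
```

The sampled values of $$V$$ decrease at every step, consistent with asymptotic stability of the origin.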

### Lyapunov instability theorem

Suppose that in a neighborhood $$U$$ of the zero solution $$\mathbf{X} = \mathbf{0}$$ there is a continuously differentiable function $$V\left( \mathbf{X} \right)$$ such that

1. $$V\left( \mathbf{0} \right) = 0$$;
2. $${\large\frac{{dV}}{{dt}}\normalsize} \gt 0$$.

If in the neighborhood $$U$$ there are points at which $$V\left( \mathbf{X} \right) \gt 0,$$ then the zero solution $$\mathbf{X} = \mathbf{0}$$ is unstable.

### Chetaev instability theorem

Suppose that in a neighborhood $$U$$ of the zero solution $$\mathbf{X} = \mathbf{0}$$ of an autonomous system there exists a continuously differentiable function $$V\left( \mathbf{X} \right).$$ Let the neighborhood $$U$$ contain a subdomain $${U_1}$$ containing the origin (Figure $$3$$), such that

1. $$V\left( \mathbf{X} \right) \gt 0$$ for all $$\mathbf{X} \in {U_1}\backslash \left\{ \mathbf{0} \right\}$$;
2. $${\large\frac{{dV}}{{dt}}\normalsize} \gt 0$$ for all $$\mathbf{X} \in {U_1}\backslash \left\{ \mathbf{0} \right\}$$;
3. $$V\left( \mathbf{X} \right) = 0$$ for all $$\mathbf{X} \in \delta {U_1},$$ where $$\delta {U_1}$$ denotes the boundary of the subdomain $${U_1}$$.

Then the zero solution $$\mathbf{X} = \mathbf{0}$$ of the system is unstable. In this case, the phase trajectories in the subdomain $${U_1}$$ will move away from the origin.
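A classical Chetaev-type example (the system and function below are an illustrative choice, not from the text) is the saddle $$x' = x,$$ $$y' = -y$$ with $$V = x^2 - y^2$$ and the subdomain $${U_1} = \left\{ x^2 \gt y^2 \right\},$$ whose boundary $$\left| x \right| = \left| y \right|$$ is exactly where $$V = 0$$:

```python
import sympy as sp

# Chetaev-type check for the illustrative saddle x' = x, y' = -y
# with V = x^2 - y^2 and U1 = {x^2 > y^2} (V = 0 on the boundary |x| = |y|).
x, y = sp.symbols('x y', real=True)
V = x**2 - y**2
f = sp.Matrix([x, -y])
dV_dt = sp.simplify(sp.Matrix([sp.diff(V, x), sp.diff(V, y)]).dot(f))
print(dV_dt)  # 2*x**2 + 2*y**2: positive on U1 \ {0}
```

Since $$V \gt 0$$ and $$\frac{dV}{dt} \gt 0$$ on $${U_1}\backslash \left\{ \mathbf{0} \right\},$$ the theorem confirms the expected instability of the saddle point.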

Thus, Lyapunov functions allow one to determine the stability or instability of a system. The advantage of this method is that we do not need to know the actual solution $$\mathbf{X}\left( t \right).$$ In addition, this method allows one to study the stability of equilibrium points of non-rough (structurally unstable) systems, for example, in the case when the equilibrium point is a center. The disadvantage is that there is no general method of constructing Lyapunov functions. In the particular case of linear homogeneous autonomous systems with constant coefficients, the Lyapunov function can be found as a quadratic form.
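For a stable linear system $$\mathbf{X}' = A\mathbf{X},$$ such a quadratic form $$V\left( \mathbf{X} \right) = \mathbf{X}^T P \mathbf{X}$$ can be obtained by solving the Lyapunov matrix equation $$A^T P + PA = -Q$$ for any positive definite $$Q.$$ A sketch using SciPy (the matrix $$A$$ below is an illustrative stable example):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative stable matrix A with eigenvalues -1 and -2.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
Q = np.eye(2)

# Solve A^T P + P A = -Q for P; then V(X) = X^T P X is a Lyapunov
# function with dV/dt = -X^T Q X < 0 along trajectories.
P = solve_continuous_lyapunov(A.T, -Q)
print(np.linalg.eigvalsh(P))  # both eigenvalues positive => V positive definite
```

Positive eigenvalues of $$P$$ mean $$V$$ is positive definite, and $$\frac{dV}{dt} = -\mathbf{X}^T Q \mathbf{X} \lt 0$$ away from the origin, so the origin is asymptotically stable.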

## Solved Problems


### Example 1

Investigate the stability of the zero solution of the system
$\frac{dx}{dt} = -2x,\;\; \frac{dy}{dt} = x - y.$

### Example 2

Investigate the stability of the zero solution of the system
$\frac{dx}{dt} = y,\;\; \frac{dy}{dt} = -x.$

### Example 3

Investigate the stability of the zero solution of the nonlinear system
$\frac{dx}{dt} = -xy^2,\;\; \frac{dy}{dt} = 3yx^2.$

### Example 4

Investigate the stability of the zero solution of the system using the method of Lyapunov functions.
$\frac{dx}{dt} = y - 2x,\;\; \frac{dy}{dt} = 2x - y - x^3.$

### Example 5

Using a Lyapunov function, investigate the stability of the zero solution of the system
$\frac{dx}{dt} = x + 3y,\;\; \frac{dy}{dt} = 2x.$

### Example 6

Investigate the stability of the zero solution of the system
$\frac{dx}{dt} = x^3 + y,\;\; \frac{dy}{dt} = x + y^3.$