### Definition of the Lie Derivative and First Integral

Consider an \(n\)th order system

\[\frac{dx_i}{dt} = f_i\left( t, x_1, x_2, \ldots, x_n \right), \;\; i = 1, 2, \ldots, n,\]

where \(f_i\left( t, x_1, x_2, \ldots, x_n \right)\) are continuously differentiable real functions defined in a domain \(D \subset {\Re^{n + 1}}.\) In vector form, the system can be written as

\[\mathbf{X}' = \mathbf{f}\left( t, \mathbf{X} \right), \;\; \text{where} \;\; \mathbf{X} = \left[ \begin{array}{c} x_1\left( t \right) \\ x_2\left( t \right) \\ \vdots \\ x_n\left( t \right) \end{array} \right], \;\; \mathbf{f} = \left[ \begin{array}{c} f_1 \\ f_2 \\ \vdots \\ f_n \end{array} \right].\]

Let also a continuously differentiable scalar function \(\mathbf{U}\left( t, \mathbf{X} \right)\) be defined in the domain \(D.\) The derivative of the function \(\mathbf{U}\left( t, \mathbf{X} \right)\) along the vector field \(\mathbf{f}\left( t, \mathbf{X} \right)\) (the Lie derivative) is given by the expression

\[L_{\mathbf{f}} \mathbf{U} = \left( \text{grad}\,\mathbf{U}, \mathbf{f} \right) = \frac{\partial \mathbf{U}}{\partial t} + \sum\limits_{i = 1}^n \frac{\partial \mathbf{U}}{\partial x_i} f_i = \frac{d\mathbf{U}}{dt},\]

where \(\text{grad}\,\mathbf{U}\) is the gradient of the function \(\mathbf{U}\) and \(\left( \text{grad}\,\mathbf{U}, \mathbf{f} \right)\) denotes the scalar (dot) product of the vectors \(\text{grad}\,\mathbf{U}\) and \(\mathbf{f}.\)

The Lie derivative is a generalization of the directional derivative along a constant direction, which is widely used in the study of functions of several variables.
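As a quick illustration, the Lie derivative can be computed symbolically. The following sketch (using sympy; the field \(\mathbf{f} = \left( y, -x \right)\) and the function \(\mathbf{U} = xy\) are arbitrary choices made here for illustration) evaluates the sum \(\partial \mathbf{U}/\partial t + \sum_i \left( \partial \mathbf{U}/\partial x_i \right) f_i:\)

```python
import sympy as sp

t, x, y = sp.symbols('t x y')

# Illustrative choices (not from the text): field f = (y, -x), function U = x*y
f = [y, -x]
U = x * y

# Lie derivative along f: dU/dt + sum_i (dU/dx_i) * f_i
L_f_U = sp.diff(U, t) + sum(sp.diff(U, xi) * fi for xi, fi in zip([x, y], f))

print(sp.simplify(L_f_U))  # the Lie derivative, here y^2 - x^2
```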

If a non-constant function \(\mathbf{U}\left( {t,\mathbf{X}} \right)\) satisfies the relationship

\[{L_\mathbf{f}}\mathbf{U} \equiv 0\]

for all \(\left( t, \mathbf{X} \right) \in D,\) then it is called a first integral of the system.
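For example, one can verify symbolically (a sketch using sympy; the planar rotation system is chosen purely for illustration) that \(\mathbf{U} = x^2 + y^2\) is a first integral of the system \(x' = -y,\) \(y' = x:\)

```python
import sympy as sp

x, y = sp.symbols('x y')

# Illustrative autonomous system x' = -y, y' = x; candidate U = x^2 + y^2
f = [-y, x]
U = x**2 + y**2

# U is a first integral iff its Lie derivative vanishes identically
L_f_U = sum(sp.diff(U, xi) * fi for xi, fi in zip([x, y], f))

print(sp.simplify(L_f_U))  # 0, so U is constant along every trajectory
```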

In the case of autonomous systems (where the right-hand sides \(f_i\) do not depend explicitly on the variable \(t\)), the first integral \(\mathbf{U} = \mathbf{U}\left( \mathbf{X} \right)\) also does not depend on \(t,\) and the defining condition takes the simple form:

\[L_\mathbf{f}\mathbf{U} \equiv 0, \;\; \Rightarrow \frac{d\mathbf{U}}{dt} = \sum\limits_{i = 1}^n \frac{\partial \mathbf{U}}{\partial x_i} f_i \equiv 0, \;\; \Rightarrow \mathbf{U}\left( \mathbf{X} \right) \equiv C,\]

where \(C\) is a constant. In what follows we restrict ourselves to autonomous systems.

As one can see, a first integral remains constant along any solution \(\mathbf{X}\left( t \right).\) In other words, the phase trajectory \(\mathbf{X}\left( t \right)\) of the system lies on a level surface of the first integral \(\mathbf{U}\left( \mathbf{X} \right).\) In the case of a second order system, this is a level curve of the first integral.
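The same fact can be observed numerically. In the sketch below (standard library only; the rotation system \(x' = -y,\) \(y' = x\) with first integral \(\mathbf{U} = x^2 + y^2,\) the step size, and the initial point are all illustrative choices), a classical fourth-order Runge-Kutta integration keeps \(\mathbf{U}\) constant along the computed trajectory up to discretization error:

```python
def rk4_step(f, X, h):
    """One classical 4th-order Runge-Kutta step for an autonomous system X' = f(X)."""
    k1 = f(X)
    k2 = f([Xi + h / 2 * k for Xi, k in zip(X, k1)])
    k3 = f([Xi + h / 2 * k for Xi, k in zip(X, k2)])
    k4 = f([Xi + h * k for Xi, k in zip(X, k3)])
    return [Xi + h / 6 * (a + 2 * b + 2 * c + d)
            for Xi, a, b, c, d in zip(X, k1, k2, k3, k4)]

# Illustrative rotation system x' = -y, y' = x; first integral U = x^2 + y^2
f = lambda X: [-X[1], X[0]]
U = lambda X: X[0] ** 2 + X[1] ** 2

X = [1.0, 0.0]               # start on the level curve U = 1
U0 = U(X)
for _ in range(1000):        # integrate over 0 <= t <= 10 with step h = 0.01
    X = rk4_step(f, X, 0.01)

print(abs(U(X) - U0))        # tiny drift: the trajectory stays on the level curve
```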

Suppose that for an \(n\)th order autonomous system, \(k\) first integrals are found:

\[\mathbf{U}_1\left( \mathbf{X} \right), \mathbf{U}_2\left( \mathbf{X} \right), \ldots, \mathbf{U}_k\left( \mathbf{X} \right), \;\; k \lt n.\]

It can be shown that the composition

\[\Phi \left[ \mathbf{U}_1\left( \mathbf{X} \right), \mathbf{U}_2\left( \mathbf{X} \right), \ldots, \mathbf{U}_k\left( \mathbf{X} \right) \right],\]

where \(\Phi\) is an arbitrary continuously differentiable function, is also a first integral of the system. In general, a system has infinitely many first integrals. From this set, one can identify functionally independent first integrals.
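As a sketch of this fact (the radial system \(x' = x,\) \(y' = y,\) \(z' = z\) and the particular integrals are illustrative choices, not taken from the text), one can check with sympy that if \(\mathbf{U}_1 = x/y\) and \(\mathbf{U}_2 = y/z\) are first integrals, then so is a smooth combination \(\Phi\left( \mathbf{U}_1, \mathbf{U}_2 \right):\)

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = [x, y, z]                  # illustrative radial system x' = x, y' = y, z' = z

def lie(U):
    """Lie derivative of U along f for this autonomous system."""
    return sp.simplify(sum(sp.diff(U, v) * fv for v, fv in zip([x, y, z], f)))

U1, U2 = x / y, y / z          # two first integrals of the radial system
assert lie(U1) == 0 and lie(U2) == 0

Phi = U1 + U1 * U2**2          # an arbitrary smooth combination Phi(U1, U2)
print(lie(Phi))                # 0: Phi(U1, U2) is again a first integral
```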

The first integrals \(\mathbf{U}_1\left( \mathbf{X} \right),\) \(\mathbf{U}_2\left( \mathbf{X} \right), \ldots,\) \(\mathbf{U}_k\left( \mathbf{X} \right)\) defined in the domain \(D \subset {\Re^n}\) are called functionally independent if for all \(\mathbf{X} \in D\) the rank of the Jacobian matrix equals the number of functions \(k:\)

\[\text{rank} \left[ \begin{array}{cccc} \frac{\partial U_1}{\partial x_1} & \frac{\partial U_1}{\partial x_2} & \cdots & \frac{\partial U_1}{\partial x_n} \\ \frac{\partial U_2}{\partial x_1} & \frac{\partial U_2}{\partial x_2} & \cdots & \frac{\partial U_2}{\partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial U_k}{\partial x_1} & \frac{\partial U_k}{\partial x_2} & \cdots & \frac{\partial U_k}{\partial x_n} \end{array} \right] = k.\]

For a second order autonomous system, there is one independent first integral, which defines the solution in implicit form. For an \(n\)th order autonomous system, there exist \(n - 1\) independent first integrals. If \(k\) independent first integrals of a system are known, the system’s order can be reduced to \(n - k.\) Finding first integrals is one of the main methods for solving nonlinear autonomous systems.
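For instance (a sketch with an arbitrarily chosen system), the second order system \(x' = y,\) \(y' = -x\) has the first integral \(\mathbf{U} = x^2 + y^2 = C.\) Solving this relation for \(y\) reduces the system to a single first order equation:

\[y = \pm \sqrt{C - x^2}, \;\; \Rightarrow \frac{dx}{dt} = \pm \sqrt{C - x^2},\]

which can then be solved by separation of variables.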

### Integrable Combinations

In order to find first integrals, the equations of the system should be transformed, by suitable arithmetic operations, into the form

\[{L_\mathbf{f}}\mathbf{U} = 0,\]

where the left-hand side is the Lie derivative of a function \(\mathbf{U}\left( \mathbf{X} \right)\) and the right-hand side is zero. The first integral \(\mathbf{U}\left( \mathbf{X} \right)\) is then found by integrating this relation.
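For example (the system is chosen arbitrarily for illustration), for the system \(x' = y,\) \(y' = x,\) multiplying the first equation by \(x,\) the second by \(y,\) and subtracting produces an integrable combination:

\[x\frac{dx}{dt} - y\frac{dy}{dt} = xy - yx = 0, \;\; \Rightarrow \frac{d}{dt}\left( \frac{x^2 - y^2}{2} \right) = 0, \;\; \Rightarrow x^2 - y^2 = C.\]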

### Symmetric Form of the System of Differential Equations

To find the first integrals it is sometimes convenient to write the original system in the so-called symmetric form:

\[\frac{dx_1}{f_1\left( x_1, x_2, \ldots, x_n \right)} = \frac{dx_2}{f_2\left( x_1, x_2, \ldots, x_n \right)} = \cdots = \frac{dx_n}{f_n\left( x_1, x_2, \ldots, x_n \right)} = \frac{dt}{1}.\]

Here we assume that the functions \(f_1, f_2, \ldots, f_n\) in the denominators do not vanish in the domain \(D \subset {\Re^n}.\)

In this notation, some pairs of ratios may admit direct integration, for example, by separation of variables. Another way to solve a system in symmetric form is to use the property of equal fractions:

\[\frac{a_1}{b_1} = \frac{a_2}{b_2} = \cdots = \frac{a_n}{b_n} = \frac{\lambda_1 a_1 + \lambda_2 a_2 + \cdots + \lambda_n a_n}{\lambda_1 b_1 + \lambda_2 b_2 + \cdots + \lambda_n b_n},\]

where we assume that \(\lambda_1 b_1 + \lambda_2 b_2 + \cdots + \lambda_n b_n \ne 0,\) and the numbers \(\lambda_1, \lambda_2, \ldots, \lambda_n\) are chosen so that the numerator is the differential of the denominator or is identically zero.
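As an illustration (this particular system is a standard textbook example, not taken from the text above), consider the symmetric system

\[\frac{dx}{y - z} = \frac{dy}{z - x} = \frac{dz}{x - y}.\]

Choosing \(\lambda_1 = \lambda_2 = \lambda_3 = 1\) makes the denominator \(\left( y - z \right) + \left( z - x \right) + \left( x - y \right) = 0,\) so the numerator must vanish as well:

\[dx + dy + dz = 0, \;\; \Rightarrow x + y + z = C_1.\]

Similarly, choosing \(\lambda_1 = x,\) \(\lambda_2 = y,\) \(\lambda_3 = z\) annihilates the denominator and yields

\[x\,dx + y\,dy + z\,dz = 0, \;\; \Rightarrow x^2 + y^2 + z^2 = C_2,\]

giving two independent first integrals.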

## Solved Problems
