Hörmander's condition
In mathematics, Hörmander's condition is a property of vector fields that, if satisfied, has many useful consequences in the theory of partial and stochastic differential equations. The condition is named after the Swedish mathematician Lars Hörmander.
Definition
Given two C1 vector fields V and W on d-dimensional Euclidean space Rd, let [V, W] denote their Lie bracket, another vector field defined by
- [math]\displaystyle{ [V, W] (x) = \mathrm{D} V(x) W(x) - \mathrm{D} W(x) V(x), }[/math]
where DV(x) denotes the Fréchet derivative of V at x ∈ Rd, which can be thought of as a matrix that is applied to the vector W(x), and vice versa.
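As a small illustrative sketch (not part of the original article), the bracket formula above can be evaluated symbolically. The fields V = ∂/∂x1 and W = x1 ∂/∂x2 below are hypothetical examples, chosen so that the bracket is non-trivial:

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
X = sp.Matrix([x1, x2])

def lie_bracket(V, W):
    # [V, W](x) = DV(x) W(x) - DW(x) V(x), following the convention above;
    # the Jacobian plays the role of the Fréchet derivative.
    return sp.simplify(V.jacobian(X) * W - W.jacobian(X) * V)

V = sp.Matrix([1, 0])   # the constant field ∂/∂x1
W = sp.Matrix([0, x1])  # the field x1 ∂/∂x2, which vanishes on x1 = 0
print(lie_bracket(V, W))  # Matrix([[0], [-1]])
```

Even though W vanishes at the origin, the bracket there is a non-zero vector, which is exactly the mechanism Hörmander's condition exploits.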
Let A0, A1, ..., An be vector fields on Rd. They are said to satisfy Hörmander's condition if, for every point x ∈ Rd, the vectors
- [math]\displaystyle{ \begin{align} &A_{j_0} (x)~,\\ &[A_{j_{0}}, A_{j_{1}}] (x)~,\\ &[[A_{j_{0}}, A_{j_{1}}], A_{j_{2}}] (x)~,\\ &\quad\vdots\quad \end{align} \qquad 0 \leq j_{0}, j_{1}, \ldots, j_{n} \leq n }[/math]
span Rd. They are said to satisfy the parabolic Hörmander condition if the same holds true, but with the index [math]\displaystyle{ j_0 }[/math] taking only values in 1,...,n.
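The definition can be checked mechanically: collect the fields and their iterated brackets up to some depth and compute the dimension they span at a point. The sketch below (an illustration under assumed field choices, not a general-purpose tool) uses a Grushin-type pair A1 = ∂/∂x, A2 = x ∂/∂y on R2:

```python
import sympy as sp

x, y = sp.symbols("x y")
X = sp.Matrix([x, y])

def bracket(V, W):
    # [V, W] = DV W - DW V, as in the definition above
    return sp.simplify(V.jacobian(X) * W - W.jacobian(X) * V)

def hormander_rank(fields, point, depth=3):
    """Collect the fields and their iterated brackets up to `depth`,
    then return the dimension they span at `point`."""
    current = list(fields)
    collected = list(fields)
    for _ in range(depth - 1):
        current = [bracket(V, W) for V in current for W in fields]
        collected += current
    M = sp.Matrix.hstack(*[V.subs(point) for V in collected])
    return M.rank()

A1 = sp.Matrix([1, 0])  # ∂/∂x
A2 = sp.Matrix([0, x])  # x ∂/∂y, vanishing on the line x = 0
print(hormander_rank([A1, A2], {x: 0, y: 0}))  # 2
```

At the origin A2 itself vanishes, so A1 and A2 alone span only one dimension; the first bracket contributes the missing direction, so Hörmander's condition holds there.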
Application to stochastic differential equations
Consider the stochastic differential equation (SDE)
- [math]\displaystyle{ \operatorname dx = A_0(x) \operatorname dt + \sum_{i=1}^n A_i(x) \circ \operatorname dW_i }[/math]
where the vector fields [math]\displaystyle{ A_0,\dotsc,A_n }[/math] are assumed to have bounded derivatives, [math]\displaystyle{ (W_1,\dotsc,W_n) }[/math] is a standard n-dimensional Brownian motion, and [math]\displaystyle{ \circ\operatorname d }[/math] stands for the Stratonovich interpretation of the SDE. Hörmander's theorem asserts that if the SDE above satisfies the parabolic Hörmander condition, then its solutions admit a smooth density with respect to Lebesgue measure.
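Such an SDE can be simulated with a Stratonovich-compatible scheme, e.g. the Heun (midpoint-corrected) method. The sketch below is a minimal illustration, assuming the Kolmogorov-type choice A0 = x1 ∂/∂x2, A1 = ∂/∂x1, where noise enters only the first coordinate yet the pair (x1, x2) still has a smooth density by Hörmander's theorem:

```python
import numpy as np

rng = np.random.default_rng(0)

def A0(x):  # drift field: A0 = x1 d/dx2 (assumed example)
    return np.array([0.0, x[0]])

def A1(x):  # diffusion field: A1 = d/dx1 (constant)
    return np.array([1.0, 0.0])

def heun_stratonovich(x0, T=1.0, n_steps=1000):
    """Heun scheme: averaging the diffusion field between the current
    point and an Euler predictor approximates the Stratonovich integral."""
    dt = T / n_steps
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        pred = x + A1(x) * dW                       # Euler predictor
        x = x + A0(x) * dt + 0.5 * (A1(x) + A1(pred)) * dW
    return x

samples = np.array([heun_stratonovich([0.0, 0.0]) for _ in range(200)])
```

Here A1 is constant, so Itô and Stratonovich coincide; the midpoint correction matters as soon as the diffusion fields depend on x.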
Application to the Cauchy problem
With the same notation as above, define a second-order differential operator F by
- [math]\displaystyle{ F = \frac1{2} \sum_{i = 1}^n A_i^2 + A_0. }[/math]
An important problem in the theory of partial differential equations is to determine sufficient conditions on the vector fields Ai for the Cauchy problem
- [math]\displaystyle{ \begin{cases} \dfrac{\partial u}{\partial t} (t, x) = F u(t, x), & t \gt 0, x \in \mathbf{R}^{d}; \\ u(t, \cdot) \to f, & \text{as } t \to 0; \end{cases} }[/math]
to have a smooth fundamental solution, i.e. a real-valued function p : (0, +∞) × R2d → R such that p(t, ·, ·) is smooth on R2d for each t and
- [math]\displaystyle{ u(t, x) = \int_{\mathbf{R}^{d}} p(t, x, y) f(y) \, \mathrm{d} y }[/math]
satisfies the Cauchy problem above. It had been known for some time that a smooth solution exists in the elliptic case, in which
- [math]\displaystyle{ A_{i} = \sum_{j = 1}^{d} a_{ji} \frac{\partial}{\partial x_{j}}, }[/math]
and the matrix A = (aji), 1 ≤ j ≤ d, 1 ≤ i ≤ n is such that AA∗ is everywhere an invertible matrix.
The great achievement of Hörmander's 1967 paper was to show that a smooth fundamental solution exists under a considerably weaker assumption: the parabolic version of the condition that now bears his name.
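A classical illustration (a standard textbook example, not spelled out in the original text) is Kolmogorov's operator on R2:
- [math]\displaystyle{ F = \frac{1}{2} \frac{\partial^2}{\partial x_1^2} + x_1 \frac{\partial}{\partial x_2}, \qquad A_1 = \frac{\partial}{\partial x_1}, \quad A_0 = x_1 \frac{\partial}{\partial x_2}. }[/math]

This operator is not elliptic, since it contains no second derivative in x2, but A1 together with the bracket [A1, A0] = −∂/∂x2 (computed with the bracket convention above) spans R2 at every point. The parabolic Hörmander condition therefore holds, and F admits a smooth fundamental solution even though its diffusion matrix is everywhere degenerate.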
Application to control systems
Let M be a smooth manifold and [math]\displaystyle{ A_0,\dotsc,A_n }[/math] be smooth vector fields on M. If these vector fields satisfy Hörmander's condition, then the control system
- [math]\displaystyle{ \dot{x} = \sum_{i=0}^{n} u_{i} A_{i}(x) }[/math]
is locally controllable in any time at every point of M. This is known as the Chow–Rashevskii theorem. See Orbit (control theory).
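The mechanism behind this controllability can be seen numerically: flowing along two fields and then reversing both produces a net displacement of order t2 in the bracket direction. The sketch below uses the same assumed Grushin-type fields A1 = ∂/∂x, A2 = x ∂/∂y, whose flows are available in closed form:

```python
import numpy as np

def flow_A1(p, t):  # exact flow of A1 = d/dx: translate in x
    x, y = p
    return np.array([x + t, y])

def flow_A2(p, t):  # exact flow of A2 = x d/dy: shear in y
    x, y = p
    return np.array([x, y + x * t])

def commutator_loop(p, t):
    """A1 for time t, A2 for time t, then reverse both:
    the net motion is approximately t^2 [A1, A2](p)."""
    p = flow_A1(p, t)
    p = flow_A2(p, t)
    p = flow_A1(p, -t)
    p = flow_A2(p, -t)
    return p

print(commutator_loop(np.array([0.0, 0.0]), 0.1))  # [0.   0.01]
```

At the origin A2 vanishes, so no direct motion in y is possible, yet the loop moves the state by (0, t2): the bracket direction is reachable, which is the heart of the Chow–Rashevskii theorem.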
References
- Bell, Denis R. (2006). The Malliavin Calculus. Mineola, NY: Dover Publications. pp. x+113. ISBN 0-486-44994-7. MR2250060 (See the introduction)
- Hörmander, Lars (1967). "Hypoelliptic second order differential equations". Acta Math. 119: 147–171. doi:10.1007/BF02392081. ISSN 0001-5962. MR0222474
Original source: https://en.wikipedia.org/wiki/Hörmander's condition.