Reduced derivative
In mathematics, the reduced derivative is a generalization of the notion of derivative that is well suited to the study of functions of bounded variation. Although functions of bounded variation have derivatives in the sense of Radon measures, it is desirable to have a derivative that takes values in the same space as the functions themselves. While the precise definition of the reduced derivative is quite involved, its key properties are easy to remember:
- it is a multiple of the usual derivative wherever it exists;
- at jump points, it is a multiple of the jump vector.
The notion of reduced derivative appears to have been introduced by Alexander Mielke and Florian Theil in 2004.
Definition
Let X be a separable, reflexive Banach space with norm || || and fix T > 0. Let BV−([0, T]; X) denote the space of all left-continuous functions z : [0, T] → X with bounded variation on [0, T].
For any function of time f, use subscripts +/− to denote the right/left continuous versions of f, i.e.
- [math]\displaystyle{ f_{+} (t) = \lim_{s \downarrow t} f(s); }[/math]
- [math]\displaystyle{ f_{-} (t) = \lim_{s \uparrow t} f(s). }[/math]
For any sub-interval [a, b] of [0, T], let Var(z, [a, b]) denote the variation of z over [a, b], i.e., the supremum
- [math]\displaystyle{ \mathrm{Var}(z, [a, b]) = \sup \left\{ \left. \sum_{i = 1}^{k} \| z(t_{i}) - z(t_{i - 1}) \| \right| a = t_{0} \lt t_{1} \lt \cdots \lt t_{k} = b, k \in \mathbb{N} \right\}. }[/math]
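The supremum in this definition can be approximated numerically: for reasonably tame functions, partition sums over finer uniform grids increase towards the variation. A minimal Python sketch (the helper names and the one-jump example are illustrative, not from the source):

```python
# Approximate Var(z, [a, b]) by a partition sum over a uniform grid.
# For the left-continuous example below (slope 1 with a unit jump at t = 1),
# refining the partition recovers the exact variation.

def partition_sum(z, a, b, k):
    """Sum |z(t_i) - z(t_{i-1})| over a uniform partition of [a, b] with k steps."""
    ts = [a + (b - a) * i / k for i in range(k + 1)]
    return sum(abs(z(ts[i]) - z(ts[i - 1])) for i in range(1, k + 1))

def z(t):
    # z(t) = t for t <= 1, z(t) = t + 1 for t > 1 (left-continuous, unit jump at 1)
    return t if t <= 1 else t + 1

# Var(z, [0, 2]) = 2 (smooth part) + 1 (jump) = 3.
approx = partition_sum(z, 0.0, 2.0, 1000)
```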
The first step in the construction of the reduced derivative is to "stretch" time so that z can be linearly interpolated at its jump points. To this end, define
- [math]\displaystyle{ \hat{\tau} \colon [0, T] \to [0, + \infty); }[/math]
- [math]\displaystyle{ \hat{\tau}(t) = t + \int_{[0, t]} \| \mathrm{d} z \| = t + \mathrm{Var}(z, [0, t]). }[/math]
The "stretched time" function τ̂ is left-continuous (i.e. τ̂ = τ̂−); moreover, τ̂− and τ̂+ are strictly increasing and agree except at the (at most countable) jump points of z. Setting T̂ = τ̂(T), this "stretch" can be inverted by
- [math]\displaystyle{ \hat{t} \colon [0, \hat{T}] \to [0, T]; }[/math]
- [math]\displaystyle{ \hat{t}(\tau) = \max \{ t \in [0, T] | \hat{\tau}(t) \leq \tau \}. }[/math]
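For the one-jump example z(t) = t (t ≤ 1), t + 1 (t > 1) on [0, 2], the stretched time and its inverse can be worked out by hand and checked numerically. A sketch (all names and the closed forms are specific to this illustrative example):

```python
# Stretched time for the example z(t) = t (t <= 1), t + 1 (t > 1) on [0, 2].

def var_z(t):
    """Var(z, [0, t]): slope 1 everywhere, plus a unit jump crossed after t = 1."""
    return t if t <= 1 else t + 1

def tau_hat(t):
    """Stretched time tau_hat(t) = t + Var(z, [0, t]); here 2t, then 2t + 1."""
    return t + var_z(t)

def t_hat(tau, T=2.0, steps=20000):
    """Inverse stretch: max{t in [0, T] : tau_hat(t) <= tau}, by grid search."""
    best = 0.0
    for i in range(steps + 1):
        t = T * i / steps
        if tau_hat(t) <= tau:
            best = t
    return best
```

Note that t̂ is constant (equal to 1) on the stretched-out jump interval τ ∈ [2, 3], reflecting that this whole interval of stretched time corresponds to the single jump instant t = 1.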
Using this, the stretched version of z is defined by
- [math]\displaystyle{ \hat{z} \in C^{0} ([0, \hat{T}]; X); }[/math]
- [math]\displaystyle{ \hat{z}(\tau) = (1 - \theta) z_{-}(t) + \theta z_{+}(t) }[/math]
where θ ∈ [0, 1] and
- [math]\displaystyle{ \tau = (1 - \theta) \hat{\tau}_{-} (t) + \theta \hat{\tau}_{+} (t). }[/math]
The effect of this definition is to create a new function ẑ which "stretches out" the jumps of z by linear interpolation. A quick calculation shows that ẑ is not just continuous, but also lies in a Sobolev space:
- [math]\displaystyle{ \hat{z} \in W^{1, \infty} ([0, \hat{T}]; X); }[/math]
- [math]\displaystyle{ \left\| \frac{\mathrm{d} \hat{z}}{\mathrm{d} \tau} \right\|_{L^{\infty} ([0, \hat{T}]; X)} \leq 1. }[/math]
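Continuing the illustrative one-jump example (z(t) = t for t ≤ 1, t + 1 for t > 1 on [0, 2], so T̂ = 5), the stretched function ẑ can be written in closed form, and its difference quotients confirm the Lipschitz bound. A sketch with hand-computed pieces:

```python
# Stretched function z_hat for the example: the jump at t = 1 is replaced by
# linear interpolation over the stretched interval tau in [2, 3].

def z_hat(tau):
    if tau <= 2:          # smooth piece: t = tau / 2, z = t
        return tau / 2
    elif tau <= 3:        # stretched jump: theta = tau - 2, (1 - theta)*1 + theta*2
        return tau - 1
    else:                 # smooth piece: t = (tau - 1) / 2, z = t + 1
        return (tau + 1) / 2

# Difference quotients stay <= 1, consistent with |dz_hat/dtau| <= 1.
h = 1e-4
slopes = [abs(z_hat(tau + h) - z_hat(tau)) / h
          for tau in [0.5, 1.5, 2.2, 2.8, 3.5, 4.5]]
```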
The derivative of ẑ(τ) with respect to τ is defined almost everywhere with respect to Lebesgue measure. The reduced derivative of z is the pull-back of this derivative by the stretching function τ̂ : [0, T] → [0, T̂]. In other words,
- [math]\displaystyle{ \mathrm{rd}(z) \colon [0, T] \to \{ x \in X | \| x \| \leq 1 \}; }[/math]
- [math]\displaystyle{ \mathrm{rd}(z)(t) = \frac{\mathrm{d} \hat{z}}{\mathrm{d} \tau} \left( \frac{\hat{\tau}_{-} (t) + \hat{\tau}_{+}(t)}{2} \right). }[/math]
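For the same illustrative example (z(t) = t for t ≤ 1, t + 1 for t > 1 on [0, 2]), ż = 1 away from the jump, so rd(z)(t) = 1/(1 + 1) = 1/2 there, while at the jump rd(z)(1) = 1, the normalized jump vector. A self-contained numerical check (names and closed forms are example-specific):

```python
# Reduced derivative for the example, via the midpoint formula above.

def z_hat(tau):
    """Stretched version of z on [0, 5] (jump stretched over tau in [2, 3])."""
    if tau <= 2:
        return tau / 2
    elif tau <= 3:
        return tau - 1
    return (tau + 1) / 2

def rd(t, h=1e-6):
    """rd(z)(t) = dz_hat/dtau at the midpoint of [tau_hat_-(t), tau_hat_+(t)]."""
    tau_minus = 2 * t if t <= 1 else 2 * t + 1   # left limit of tau_hat
    tau_plus = 2 * t if t < 1 else 2 * t + 1     # right limit of tau_hat
    mid = (tau_minus + tau_plus) / 2
    return (z_hat(mid + h) - z_hat(mid - h)) / (2 * h)
```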
Associated with this pull-back of the derivative is the pull-back of Lebesgue measure on [0, T̂], which defines the differential measure μz:
- [math]\displaystyle{ \mu_{z} ([t_{1}, t_{2})) = \lambda ([\hat{\tau}(t_{1}), \hat{\tau}(t_{2}))) = \hat{\tau} (t_{2}) - \hat{\tau}(t_{1}) = t_{2} - t_{1} + \int_{[t_{1}, t_{2})} \| \mathrm{d} z \|. }[/math]
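In the running one-jump example (z(t) = t for t ≤ 1, t + 1 for t > 1 on [0, 2]), μz is Lebesgue measure scaled by 1 + |ż| = 2 on the smooth parts, plus a unit atom at the jump point t = 1. A small illustrative check:

```python
# mu_z for the example: tau_hat(t) = 2t for t <= 1 and 2t + 1 for t > 1
# (hand-computed for this example), so mu_z([t1, t2)) = tau_hat(t2) - tau_hat(t1).

def tau_hat(t):
    return 2 * t if t <= 1 else 2 * t + 1

def mu_z(t1, t2):
    """mu_z of the half-open interval [t1, t2)."""
    return tau_hat(t2) - tau_hat(t1)
```

Note that μz([0, 1)) = 2 excludes the atom at t = 1, while μz([0, 2)) = 5 includes it.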
Properties
- The reduced derivative rd(z) is defined only μz-almost everywhere on [0, T].
- If t is a jump point of z, then
- [math]\displaystyle{ \mu_{z} (\{ t \}) = \| z_{+}(t) - z_{-}(t) \| \mbox{ and } \mathrm{rd}(z)(t) = \frac{z_{+}(t) - z_{-}(t)}{\| z_{+}(t) - z_{-}(t) \|}. }[/math]
- If z is differentiable on (t1, t2), then
- [math]\displaystyle{ \mu_{z} ((t_{1}, t_{2})) = \int_{t_{1}}^{t_{2}} 1 + \| \dot{z}(t) \| \, \mathrm{d} t }[/math]
- and, for t ∈ (t1, t2),
- [math]\displaystyle{ \mathrm{rd}(z)(t) = \frac{\dot{z}(t)}{1 + \| \dot{z}(t) \|}. }[/math]
- For 0 ≤ s < t ≤ T,
- [math]\displaystyle{ \int_{[s, t)} \mathrm{rd}(z)(r) \, \mathrm{d} \mu_{z} (r) = \int_{[s, t)} \mathrm{d} z = z(t) - z(s). }[/math]
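The integral identity can be verified for the illustrative one-jump example (z(t) = t for t ≤ 1, t + 1 for t > 1 on [0, 2]): the smooth parts contribute rd(z) dμz = (1/2) · 2 dt = dt, and the jump contributes its atom. A sketch (piecewise evaluation is hand-derived for this example):

```python
# Check of the identity: integral of rd(z) over [s, t) against mu_z equals
# z(t) - z(s).  On smooth parts rd(z) = 1/2 and dmu_z = 2 dt; the jump at
# t = 1 carries an atom of mass |z_+ - z_-| = 1 with rd(z)(1) = 1.

def integral_rd(s, t):
    """Integral of rd(z) over the half-open interval [s, t) against mu_z."""
    total = 0.0
    total += t - s           # smooth contribution: (1/2) * 2 dt = dt
    if s <= 1 < t:           # atomic contribution at the jump point t = 1
        total += 1.0 * 1.0   # rd(z)(1) * mu_z({1})
    return total

def z(u):
    return u if u <= 1 else u + 1
```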
References
- Mielke, Alexander; Theil, Florian (2004). "On rate-independent hysteresis models". NoDEA Nonlinear Differential Equations Appl. 11 (2): 151–189. doi:10.1007/s00030-003-1052-7. ISSN 1021-9722. MR2210284
Original source: https://en.wikipedia.org/wiki/Reduced derivative.