Ramp function
The ramp function is a unary real function whose graph is shaped like a ramp. It can be defined in several equivalent ways, for example as "0 for negative inputs; equal to the input for non-negative inputs". The term "ramp" can also be used for other functions obtained by scaling and shifting, and the function in this article is the unit ramp function (slope 1, starting at 0).
In mathematics, the ramp function is also known as the positive part.
In machine learning, it is commonly known as a ReLU activation function[1][2] or a rectifier in analogy to half-wave rectification in electrical engineering. In statistics (when used as a likelihood function) it is known as a tobit model.
This function has numerous applications in mathematics and engineering, and goes by various names depending on the context. Differentiable variants of the ramp function also exist.
Definitions
The ramp function ([math]\displaystyle{ R : \Reals \to \Reals_{\ge 0} }[/math]) may be defined analytically in several ways. Possible definitions are:
- A piecewise function: [math]\displaystyle{ R(x) := \begin{cases} x, & x \ge 0; \\ 0, & x\lt 0 \end{cases} }[/math]
- The max function: [math]\displaystyle{ R(x) := \max(x,0) }[/math]
- The mean of an independent variable and its absolute value (a straight line with unity gradient and its modulus): [math]\displaystyle{ R(x) := \frac{x+|x|}{2} }[/math] This can be derived from the identity [math]\displaystyle{ \max(a,b) = \frac{a + b + |a - b|}{2} }[/math] by setting a = x and b = 0.
- The Heaviside step function multiplied by a straight line with unity gradient: [math]\displaystyle{ R\left( x \right) := x H(x) }[/math]
- The convolution of the Heaviside step function with itself: [math]\displaystyle{ R\left( x \right) := H(x) * H(x) }[/math]
- The integral of the Heaviside step function:[3] [math]\displaystyle{ R(x) := \int_{-\infty}^{x} H(\xi)\,d\xi }[/math]
- Macaulay brackets: [math]\displaystyle{ R(x) := \langle x\rangle }[/math]
- The positive part of the identity function: [math]\displaystyle{ R := \operatorname{id}^+ }[/math]
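The equivalence of these definitions can be checked numerically. The following sketch (illustrative only; function names are this example's own) evaluates four of them at sample points and confirms they agree:

```python
def heaviside(x):
    """Heaviside step function with the convention H(0) = 1;
    the value at 0 does not affect R(0) = 0 * H(0) = 0."""
    return 1.0 if x >= 0 else 0.0

def ramp_piecewise(x):
    """Definition 1: x for x >= 0, else 0."""
    return x if x >= 0 else 0.0

def ramp_max(x):
    """Definition 2: max(x, 0)."""
    return max(x, 0.0)

def ramp_abs(x):
    """Definition 3: (x + |x|) / 2."""
    return (x + abs(x)) / 2

def ramp_heaviside(x):
    """Definition 4: x * H(x)."""
    return x * heaviside(x)

for x in [-3.5, -1.0, 0.0, 0.5, 2.0, 7.25]:
    values = {ramp_piecewise(x), ramp_max(x), ramp_abs(x), ramp_heaviside(x)}
    assert len(values) == 1, f"definitions disagree at x={x}"
```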
Applications
The ramp function has numerous applications in engineering, such as in the theory of digital signal processing.
In finance, the payoff of a call option is a ramp (shifted by the strike price). Horizontally flipping a ramp yields a put option, while vertically flipping (taking the negative) corresponds to selling or being "short" an option. In finance, the shape is widely called a "hockey stick", because it resembles an ice hockey stick.
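The payoff shapes above can be sketched directly in terms of the ramp. The strike price K = 100 below is an arbitrary example value, not from the article:

```python
def ramp(x):
    return max(x, 0.0)

K = 100.0  # hypothetical strike price, chosen for illustration

def call_payoff(S):
    """Long call: a ramp shifted right by the strike."""
    return ramp(S - K)

def put_payoff(S):
    """Long put: the ramp flipped horizontally about the strike."""
    return ramp(K - S)

def short_call_payoff(S):
    """Short (sold) call: the vertical flip (negative) of the long call."""
    return -call_payoff(S)

for S in [80.0, 100.0, 120.0]:
    print(S, call_payoff(S), put_payoff(S), short_call_payoff(S))
```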
In statistics, hinge functions of multivariate adaptive regression splines (MARS) are ramps, and are used to build regression models.
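A MARS-style hinge basis consists of mirrored ramp pairs about a knot. The knot and coefficients below are illustrative placeholders, not values from any fitted model:

```python
def hinge(x, c):
    """Hinge function max(x - c, 0): a ramp with its corner at the knot c."""
    return max(x - c, 0.0)

c = 1.0  # hypothetical knot location

def model(x, b0=0.5, b1=2.0, b2=-1.0):
    """A piecewise-linear model built from a mirrored hinge pair:
    the slope can change at the knot c."""
    return b0 + b1 * hinge(x, c) + b2 * hinge(c, x)

for x in [0.0, 1.0, 2.0]:
    print(x, model(x))
```

The mirrored pair hinge(x, c) and hinge(c, x) together span any continuous piecewise-linear function with a single break at c, which is why MARS adds hinges in such pairs.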
Analytic properties
Non-negativity
In the whole domain the function is non-negative, so its absolute value is itself, i.e. [math]\displaystyle{ \forall x \in \Reals: R(x) \geq 0 }[/math] and [math]\displaystyle{ \left| R (x) \right| = R(x) }[/math]
This follows from definition 2 (the max function): the graph coincides with the line y = x in the first quadrant and with the x-axis in the second, so it is non-negative everywhere.
Derivative
Its derivative is the Heaviside step function: [math]\displaystyle{ R'(x) = H(x)\quad \mbox{for } x \ne 0. }[/math]
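A quick numerical sketch: away from x = 0, a central finite difference of the ramp reproduces the Heaviside step (the step size and sample points are arbitrary choices):

```python
def ramp(x):
    return max(x, 0.0)

def heaviside(x):
    return 1.0 if x > 0 else 0.0  # value at 0 is irrelevant here (x != 0)

h = 1e-6  # finite-difference step
for x in [-2.0, -0.5, 0.5, 2.0]:  # nonzero sample points only
    numeric = (ramp(x + h) - ramp(x - h)) / (2 * h)
    assert abs(numeric - heaviside(x)) < 1e-9
```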
Second derivative
The ramp function satisfies the differential equation: [math]\displaystyle{ \frac{d^2}{dx^2} R(x - x_0) = \delta(x - x_0), }[/math] where δ(x) is the Dirac delta. This means that R(x) is a Green's function for the second derivative operator. Thus, any function, f(x), with an integrable second derivative, f″(x), will satisfy the equation: [math]\displaystyle{ f(x) = f(a) + (x-a) f'(a) + \int_{a}^b R(x - s) f''(s) \,ds \quad \mbox{for }a \lt x \lt b . }[/math]
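The Green's-function identity can be verified numerically. The sketch below uses the arbitrary test function f(x) = x³ on [a, b] = [0, 2] and a midpoint Riemann sum for the remainder integral:

```python
def ramp(x):
    return max(x, 0.0)

def f(x):   return x ** 3
def fp(x):  return 3 * x ** 2  # f'
def fpp(x): return 6 * x       # f''

a, b, x = 0.0, 2.0, 1.5
n = 200_000
ds = (b - a) / n

# midpoint Riemann sum of  integral_a^b R(x - s) f''(s) ds
integral = sum(
    ramp(x - (a + (k + 0.5) * ds)) * fpp(a + (k + 0.5) * ds)
    for k in range(n)
) * ds

approx = f(a) + (x - a) * fp(a) + integral
assert abs(approx - f(x)) < 1e-6  # both sides equal 1.5**3 = 3.375
```

Note that R(x - s) vanishes for s > x, so the integral over [a, b] automatically truncates to the usual Taylor-remainder integral over [a, x].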
Fourier transform
[math]\displaystyle{ \mathcal{F}\big\{ R(x) \big\}(f) = \int_{-\infty}^{\infty} R(x) e^{-2\pi ifx} \, dx = \frac{i\delta '(f)}{4\pi}-\frac{1}{4 \pi^2 f^2}, }[/math] where δ(x) is the Dirac delta (in this formula, its derivative appears).
Laplace transform
The single-sided Laplace transform of R(x) is given as follows,[4] [math]\displaystyle{ \mathcal{L}\big\{R(x)\big\} (s) = \int_{0}^{\infty} e^{-sx}R(x)\,dx = \frac{1}{s^2}. }[/math]
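This transform is easy to check numerically at a single point. The sketch below evaluates the integral at the arbitrary value s = 2, truncating at x = 40 since the integrand x·e⁻ˢˣ decays rapidly:

```python
import math

def ramp(x):
    return max(x, 0.0)

s = 2.0       # evaluation point, chosen for illustration
upper = 40.0  # truncation of the infinite upper limit; the tail is negligible
n = 400_000
dx = upper / n

# midpoint Riemann sum of  integral_0^inf e^{-s x} R(x) dx
integral = sum(
    math.exp(-s * (k + 0.5) * dx) * ramp((k + 0.5) * dx)
    for k in range(n)
) * dx

assert abs(integral - 1.0 / s ** 2) < 1e-6  # 1/s^2 = 0.25
```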
Algebraic properties
Iteration invariance
The ramp mapping is idempotent: composing it with itself returns the ramp function, as [math]\displaystyle{ R \big( R(x) \big) = R(x) . }[/math]
This follows from definition 3 together with non-negativity: [math]\displaystyle{ R \big( R(x) \big) = \frac{R(x)+|R(x)|}{2} = \frac{R(x)+R(x)}{2} = R(x) . }[/math]
See also
References
- ↑ Brownlee, Jason (8 January 2019). "A Gentle Introduction to the Rectified Linear Unit (ReLU)". https://machinelearningmastery.com/rectified-linear-activation-function-for-deep-learning-neural-networks/.
- ↑ Liu, Danqing (30 November 2017). "A Practical Guide to ReLU" (in en). https://medium.com/@danqing/a-practical-guide-to-relu-b83ca804f1f7.
- ↑ Weisstein, Eric W.. "Ramp Function". http://mathworld.wolfram.com/RampFunction.html.
- ↑ "The Laplace Transform of Functions". https://lpsa.swarthmore.edu/LaplaceXform/FwdLaplace/LaplaceFuncs.html#Ramp.
Original source: https://en.wikipedia.org/wiki/Ramp_function.