Differentiation rules
This is a summary of differentiation rules, that is, rules for computing the derivative of a function in calculus.
Elementary rules of differentiation
Unless otherwise stated, all functions are functions of real numbers (R) that return real values, although, more generally, the formulae below apply wherever they are well defined,[1][2] including the case of complex numbers (C).[3]
Constant term rule
For any value of [math]\displaystyle{ c }[/math], where [math]\displaystyle{ c \in \mathbb{R} }[/math], if [math]\displaystyle{ f(x) }[/math] is the constant function given by [math]\displaystyle{ f(x) = c }[/math], then [math]\displaystyle{ \frac{df}{dx} = 0 }[/math].[4]
Proof
Let [math]\displaystyle{ c \in \mathbb{R} }[/math] and [math]\displaystyle{ f(x) = c }[/math]. By the definition of the derivative,
- [math]\displaystyle{ \begin{align} f'(x) &= \lim_{h \to 0}\frac{f(x + h) - f(x)}{h} \\ &= \lim_{h \to 0} \frac{(c) - (c)}{h} \\ &= \lim_{h \to 0} \frac{0}{h} \\ &= \lim_{h \to 0} 0 \\ &= 0 \end{align} }[/math]
This shows that the derivative of any constant function is 0.
Intuitive (geometric) explanation
The derivative of a function at a point is the slope of the line tangent to the curve at that point. The slope of a constant function is zero, because the tangent line to a constant function is horizontal and its angle of inclination is zero.
In other words, the value of the constant function, y, will not change as the value of x increases or decreases.
Differentiation is linear
For any functions [math]\displaystyle{ f }[/math] and [math]\displaystyle{ g }[/math] and any real numbers [math]\displaystyle{ a }[/math] and [math]\displaystyle{ b }[/math], the derivative of the function [math]\displaystyle{ h(x) = af(x) + bg(x) }[/math] with respect to [math]\displaystyle{ x }[/math] is: [math]\displaystyle{ h'(x) = a f'(x) + b g'(x). }[/math]
In Leibniz's notation this is written as: [math]\displaystyle{ \frac{d(af+bg)}{dx} = a\frac{df}{dx} +b\frac{dg}{dx}. }[/math]
Special cases include:
- The constant factor rule [math]\displaystyle{ (af)' = af' }[/math]
- The sum rule [math]\displaystyle{ (f + g)' = f' + g' }[/math]
- The difference rule [math]\displaystyle{ (f - g)' = f' - g'. }[/math]
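For example, if [math]\displaystyle{ h(x) = 3f(x) - 2g(x) }[/math], then [math]\displaystyle{ h'(x) = 3f'(x) - 2g'(x) }[/math]; in particular, adding a constant to a function does not change its derivative, since by the constant term rule the constant contributes [math]\displaystyle{ 0 }[/math].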
The product rule
For the functions f and g, the derivative of the function h(x) = f(x) g(x) with respect to x is [math]\displaystyle{ h'(x) = (fg)'(x) = f'(x) g(x) + f(x) g'(x). }[/math] In Leibniz's notation this is written [math]\displaystyle{ \frac{d(fg)}{dx} = g \frac{df}{dx} + f \frac{dg}{dx}. }[/math]
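For instance, taking [math]\displaystyle{ f(x) = g(x) = x }[/math] (whose derivative is [math]\displaystyle{ 1 }[/math]), the product rule gives [math]\displaystyle{ (x \cdot x)' = 1 \cdot x + x \cdot 1 = 2x, }[/math] in agreement with the power rule below.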
The chain rule
The derivative of the function [math]\displaystyle{ h(x) = f(g(x)) }[/math] is [math]\displaystyle{ h'(x) = f'(g(x))\cdot g'(x). }[/math]
In Leibniz's notation, this is written as: [math]\displaystyle{ \frac{d}{dx}h(x) = \left.\frac{d}{dz}f(z)\right|_{z=g(x)}\cdot \frac{d}{dx}g(x), }[/math] often abridged to [math]\displaystyle{ \frac{dh(x)}{dx} = \frac{df(g(x))}{dg(x)} \cdot \frac{dg(x)}{dx}. }[/math]
In terms of maps, with the differential regarded as a map [math]\displaystyle{ \text{D} }[/math], this is written more concisely as: [math]\displaystyle{ [\text{D} (f\circ g)]_x = [\text{D} f]_{g(x)} \cdot [\text{D}g]_x\,. }[/math]
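For example, for [math]\displaystyle{ h(x) = \left(x^2+1\right)^5 }[/math], take [math]\displaystyle{ f(z) = z^5 }[/math] and [math]\displaystyle{ g(x) = x^2+1 }[/math]; then, using the power rule below, [math]\displaystyle{ h'(x) = 5\left(x^2+1\right)^4 \cdot 2x = 10x\left(x^2+1\right)^4. }[/math]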
The inverse function rule
If the function f has an inverse function g, meaning that [math]\displaystyle{ g(f(x)) = x }[/math] and [math]\displaystyle{ f(g(y)) = y, }[/math] then [math]\displaystyle{ g' = \frac{1}{f'\circ g}. }[/math]
In Leibniz notation, this is written as [math]\displaystyle{ \frac{dx}{dy} = \frac{1}{\frac{dy}{dx}}. }[/math]
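For example, [math]\displaystyle{ f(x) = x^3 }[/math] has inverse [math]\displaystyle{ g(y) = y^{1/3} }[/math], and since [math]\displaystyle{ f'(x) = 3x^2 }[/math], the rule gives
- [math]\displaystyle{ g'(y) = \frac{1}{f'(g(y))} = \frac{1}{3y^{2/3}}, \qquad y \neq 0. }[/math]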
Power laws, polynomials, quotients, and reciprocals
The polynomial or elementary power rule
If [math]\displaystyle{ f(x) = x^r }[/math], for any real number [math]\displaystyle{ r \neq 0, }[/math] then
- [math]\displaystyle{ f'(x) = rx^{r-1}. }[/math]
When [math]\displaystyle{ r = 1, }[/math] this becomes the special case that if [math]\displaystyle{ f(x) = x, }[/math] then [math]\displaystyle{ f'(x) = 1. }[/math]
Combining the power rule with the sum and constant multiple rules permits the computation of the derivative of any polynomial.
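For example, combining these rules,
- [math]\displaystyle{ \frac{d}{dx}\left(5x^4 - 3x^2 + 7x - 2\right) = 20x^3 - 6x + 7. }[/math]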
The reciprocal rule
The derivative of [math]\displaystyle{ h(x)=\frac{1}{f(x)} }[/math] for any (nonvanishing) function f is:
- [math]\displaystyle{ h'(x) = -\frac{f'(x)}{(f(x))^2} }[/math] wherever f is non-zero.
In Leibniz's notation, this is written
- [math]\displaystyle{ \frac{d(1/f)}{dx} = -\frac{1}{f^2}\frac{df}{dx}. }[/math]
The reciprocal rule can be derived either from the quotient rule, or from the combination of power rule and chain rule.
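For example, [math]\displaystyle{ h(x) = \frac{1}{x^2+1} }[/math] has derivative [math]\displaystyle{ h'(x) = -\frac{2x}{\left(x^2+1\right)^2} }[/math]; equivalently, writing [math]\displaystyle{ \tfrac{1}{f} = f^{-1} }[/math] and applying the power rule together with the chain rule gives [math]\displaystyle{ \left(f^{-1}\right)' = -f^{-2}f' }[/math].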
The quotient rule
If f and g are functions, then:
- [math]\displaystyle{ \left(\frac{f}{g}\right)' = \frac{f'g - g'f}{g^2}\quad }[/math] wherever g is nonzero.
This can be derived from the product rule and the reciprocal rule.
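For example,
- [math]\displaystyle{ \left(\frac{x^2}{x+1}\right)' = \frac{2x(x+1) - x^2}{(x+1)^2} = \frac{x^2+2x}{(x+1)^2}, \qquad x \neq -1. }[/math]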
Generalized power rule
The elementary power rule generalizes considerably. The most general power rule is the functional power rule: for any functions f and g,
- [math]\displaystyle{ (f^g)' = \left(e^{g\ln f}\right)' = f^g\left(f'{g \over f} + g'\ln f\right),\quad }[/math]
wherever both sides are well defined.
Special cases
- If [math]\displaystyle{ f(x)=x^a\! }[/math], then [math]\displaystyle{ f'(x)=ax^{a-1} }[/math] when a is any non-zero real number and x is positive.
- The reciprocal rule may be derived as the special case where [math]\displaystyle{ g(x)=-1\! }[/math].
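For example, taking [math]\displaystyle{ f(x) = x^2+1 }[/math] and [math]\displaystyle{ g(x) = x }[/math] (so that [math]\displaystyle{ f \gt 0 }[/math] everywhere), the functional power rule gives
- [math]\displaystyle{ \left(\left(x^2+1\right)^x\right)' = \left(x^2+1\right)^x\left(\frac{2x^2}{x^2+1} + \ln\left(x^2+1\right)\right). }[/math]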
Derivatives of exponential and logarithmic functions
- [math]\displaystyle{ \frac{d}{dx}\left(c^{ax}\right) = {ac^{ax} \ln c } ,\qquad c \gt 0 }[/math]
The equation above holds for all c, but for [math]\displaystyle{ c\lt 0 }[/math] the derivative yields a complex number, since [math]\displaystyle{ \ln c }[/math] is then complex.
- [math]\displaystyle{ \frac{d}{dx}\left(e^{ax}\right) = ae^{ax} }[/math]
- [math]\displaystyle{ \frac{d}{dx}\left( \log_c x\right) = {1 \over x \ln c} , \qquad c \gt 1 }[/math]
The equation above also holds for any base [math]\displaystyle{ c \neq 1 }[/math], but yields a complex number if [math]\displaystyle{ c\lt 0\! }[/math].
- [math]\displaystyle{ \frac{d}{dx}\left( \ln x\right) = {1 \over x} ,\qquad x \gt 0. }[/math]
- [math]\displaystyle{ \frac{d}{dx}\left( \ln |x|\right) = {1 \over x} ,\qquad x \neq 0. }[/math]
- [math]\displaystyle{ \frac{d}{dx}\left( W(x)\right) = {1 \over {x+e^{W(x)}}} ,\qquad x \gt -{1 \over e}.\qquad }[/math]where [math]\displaystyle{ W(x) }[/math] is the Lambert W function
- [math]\displaystyle{ \frac{d}{dx}\left( x^x \right) = x^x(1+\ln x). }[/math]
- [math]\displaystyle{ \frac{d}{dx}\left( f(x)^{ g(x) } \right ) = g(x)f(x)^{g(x)-1} \frac{df}{dx} + f(x)^{g(x)}\ln{( f(x) )}\frac{dg}{dx}, \qquad \text{if }f(x) \gt 0, \text{ and if } \frac{df}{dx} \text{ and } \frac{dg}{dx} \text{ exist.} }[/math]
- [math]\displaystyle{ \frac{d}{dx}\left( f_{1}(x)^{f_{2}(x)^{\left ( ... \right )^{f_{n}(x)}}} \right ) = \left [\sum\limits_{k=1}^{n} \frac{\partial }{\partial x_{k}} \left( f_{1}(x_1)^{f_{2}(x_2)^{\left ( ... \right )^{f_{n}(x_n)}}} \right ) \right ] \biggr\vert_{x_1 = x_2 = ... =x_n = x}, \text{ if } f_{i\lt n}(x) \gt 0 \text{ and } }[/math] [math]\displaystyle{ \frac{df_{i}}{dx} \text{ exists. } }[/math]
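For example, the first and third formulas above give
- [math]\displaystyle{ \frac{d}{dx}\left(2^{3x}\right) = 3 \cdot 2^{3x} \ln 2, \qquad \frac{d}{dx}\left(\log_{10} x\right) = \frac{1}{x \ln 10}. }[/math]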
Logarithmic derivatives
The logarithmic derivative is another way of stating the rule for differentiating the logarithm of a function (using the chain rule):
- [math]\displaystyle{ (\ln f)'= \frac{f'}{f} \quad }[/math] wherever f is positive.
Logarithmic differentiation is a technique that uses logarithms and their differentiation rules to simplify certain expressions before actually taking the derivative.
Logarithms can be used to remove exponents, convert products into sums, and convert division into subtraction — each of which may lead to a simplified expression for taking derivatives.
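For example, to differentiate [math]\displaystyle{ y = x^x }[/math] for [math]\displaystyle{ x \gt 0 }[/math], take logarithms to get [math]\displaystyle{ \ln y = x \ln x }[/math]; differentiating both sides gives [math]\displaystyle{ \frac{y'}{y} = \ln x + 1 }[/math], so [math]\displaystyle{ y' = x^x(1 + \ln x) }[/math], in agreement with the formula above.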
Derivatives of trigonometric functions
Derivative | Derivative of inverse function
---|---
[math]\displaystyle{ (\sin x)' = \cos x = \frac{e^{ix} + e^{-ix}}{2} }[/math] | [math]\displaystyle{ (\arcsin x)' = { 1 \over \sqrt{1 - x^2}} }[/math]
[math]\displaystyle{ (\cos x)' = -\sin x = \frac{e^{-ix} - e^{ix}}{2i} }[/math] | [math]\displaystyle{ (\arccos x)' = -{1 \over \sqrt{1 - x^2}} }[/math]
[math]\displaystyle{ (\tan x)' = \sec^2 x = { 1 \over \cos^2 x} = 1 + \tan^2 x }[/math] | [math]\displaystyle{ (\arctan x)' = { 1 \over 1 + x^2} }[/math]
[math]\displaystyle{ (\cot x)' = -\csc^2 x = -{ 1 \over \sin^2 x} = -1 - \cot^2 x }[/math] | [math]\displaystyle{ (\operatorname{arccot} x)' = {1 \over -1 - x^2} }[/math]
[math]\displaystyle{ (\sec x)' = \sec{x}\tan{x} }[/math] | [math]\displaystyle{ (\operatorname{arcsec} x)' = { 1 \over |x|\sqrt{x^2 - 1}} }[/math]
[math]\displaystyle{ (\csc x)' = -\csc{x}\cot{x} }[/math] | [math]\displaystyle{ (\operatorname{arccsc} x)' = -{1 \over |x|\sqrt{x^2 - 1}} }[/math]
The derivatives in the table above are for when the range of the inverse secant is [math]\displaystyle{ [0,\pi]\! }[/math] and when the range of the inverse cosecant is [math]\displaystyle{ \left[-\frac{\pi}{2},\frac{\pi}{2}\right]\! }[/math].
It is common to additionally define an inverse tangent function with two arguments, [math]\displaystyle{ \arctan(y,x)\! }[/math]. Its value lies in the range [math]\displaystyle{ [-\pi,\pi]\! }[/math] and reflects the quadrant of the point [math]\displaystyle{ (x,y)\! }[/math]. For the first and fourth quadrants (i.e. [math]\displaystyle{ x \gt 0\! }[/math]) one has [math]\displaystyle{ \arctan(y, x\gt 0) = \arctan(y/x)\! }[/math]. Its partial derivatives are
[math]\displaystyle{ \frac{\partial \arctan(y,x)}{\partial y} = \frac{x}{x^2 + y^2} }[/math], and [math]\displaystyle{ \frac{\partial \arctan(y,x)}{\partial x} = \frac{-y}{x^2 + y^2}. }[/math]
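For [math]\displaystyle{ x \gt 0 }[/math] these follow from the single-argument formula by the chain rule: for instance, [math]\displaystyle{ \frac{\partial}{\partial y}\arctan(y/x) = \frac{1}{1+(y/x)^2}\cdot\frac{1}{x} = \frac{x}{x^2+y^2} }[/math], and similarly for the derivative with respect to [math]\displaystyle{ x }[/math].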
Derivatives of hyperbolic functions
Derivative | Derivative of inverse function
---|---
[math]\displaystyle{ ( \sinh x )'= \cosh x = \frac{e^x + e^{-x}}{2} }[/math] | [math]\displaystyle{ (\operatorname{arcsinh}x)' = { 1 \over \sqrt{1 + x^2}} }[/math]
[math]\displaystyle{ (\cosh x )'= \sinh x = \frac{e^x - e^{-x}}{2} }[/math] | [math]\displaystyle{ (\operatorname{arccosh}x)' = {\frac {1}{\sqrt{x^2-1}}} }[/math]
[math]\displaystyle{ (\tanh x )'= {\operatorname{sech}^2x} = { 1 \over \cosh^2 x} = 1 - \tanh^2 x }[/math] | [math]\displaystyle{ (\operatorname{arctanh}x)' = { 1 \over 1 - x^2} }[/math]
[math]\displaystyle{ (\coth x )' = -\operatorname{csch}^2x = -{ 1 \over \sinh^2 x} = 1 - \coth^2 x }[/math] | [math]\displaystyle{ (\operatorname{arccoth}x)' =(\operatorname{arctanh}\frac{1}{x})'= \frac {-x^{-2}} {1-\frac {1}{x^2}}=\frac{1}{1-x^2} }[/math]
[math]\displaystyle{ (\operatorname{sech} x)' = -\operatorname{sech}{x}\tanh{x} }[/math] | [math]\displaystyle{ (\operatorname{arcsech}x)' = -{1 \over x\sqrt{1 - x^2}} }[/math]
[math]\displaystyle{ (\operatorname{csch}x)' = -\operatorname{csch}{x}\coth{x} }[/math] | [math]\displaystyle{ (\operatorname{arccsch}x)' = -{1 \over |x|\sqrt{1 + x^2}} }[/math]
See Hyperbolic functions for restrictions on these derivatives.
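For example, the derivative of the hyperbolic tangent follows from the quotient rule:
- [math]\displaystyle{ (\tanh x)' = \left(\frac{\sinh x}{\cosh x}\right)' = \frac{\cosh^2 x - \sinh^2 x}{\cosh^2 x} = \frac{1}{\cosh^2 x} = \operatorname{sech}^2 x, }[/math]
using the identity [math]\displaystyle{ \cosh^2 x - \sinh^2 x = 1 }[/math].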
Derivatives of special functions
- Gamma function
- [math]\displaystyle{ \Gamma(x) = \int_0^\infty t^{x-1} e^{-t}\, dt }[/math]
- [math]\displaystyle{ \begin{align} \Gamma'(x) & = \int_0^\infty t^{x-1} e^{-t} \ln t\,dt \\ & = \Gamma(x) \left(\sum_{n=1}^\infty \left(\ln\left(1 + \dfrac{1}{n}\right) - \dfrac{1}{x + n}\right) - \dfrac{1}{x}\right) \\ & = \Gamma(x) \psi(x) \end{align} }[/math] with [math]\displaystyle{ \psi(x) }[/math] being the digamma function, expressed by the parenthesized expression to the right of [math]\displaystyle{ \Gamma(x) }[/math] in the line above.
- Riemann zeta function
- [math]\displaystyle{ \zeta(x) = \sum_{n=1}^\infty \frac{1}{n^x} }[/math]
- [math]\displaystyle{ \begin{align} \zeta'(x) & = -\sum_{n=1}^\infty \frac{\ln n}{n^x} =-\frac{\ln 2}{2^x} - \frac{\ln 3}{3^x} - \frac{\ln 4}{4^x} - \cdots \\ & = -\sum_{p \text{ prime}} \frac{p^{-x} \ln p}{(1-p^{-x})^2} \prod_{q \text{ prime}, q \neq p} \frac{1}{1-q^{-x}} \end{align} }[/math]
Derivatives of integrals
Suppose that it is required to differentiate with respect to x the function
- [math]\displaystyle{ F(x)=\int_{a(x)}^{b(x)}f(x,t)\,dt, }[/math]
where the functions [math]\displaystyle{ f(x,t) }[/math] and [math]\displaystyle{ \frac{\partial}{\partial x}\,f(x,t) }[/math] are both continuous in both [math]\displaystyle{ t }[/math] and [math]\displaystyle{ x }[/math] in some region of the [math]\displaystyle{ (t,x) }[/math] plane, including [math]\displaystyle{ a(x)\leq t\leq b(x), }[/math] [math]\displaystyle{ x_0\leq x\leq x_1 }[/math], and the functions [math]\displaystyle{ a(x) }[/math] and [math]\displaystyle{ b(x) }[/math] are both continuous and both have continuous derivatives for [math]\displaystyle{ x_0\leq x\leq x_1 }[/math]. Then for [math]\displaystyle{ \,x_0\leq x\leq x_1 }[/math]:
- [math]\displaystyle{ F'(x) = f(x,b(x))\,b'(x) - f(x,a(x))\,a'(x) + \int_{a(x)}^{b(x)} \frac{\partial}{\partial x}\, f(x,t)\; dt\,. }[/math]
This formula is the general form of the Leibniz integral rule and can be derived using the fundamental theorem of calculus.
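For example, if [math]\displaystyle{ F(x) = \int_x^{x^2} \sin t \, dt }[/math], the integrand does not depend on [math]\displaystyle{ x }[/math], so the rule gives
- [math]\displaystyle{ F'(x) = \sin\left(x^2\right) \cdot 2x - \sin x, }[/math]
which agrees with differentiating [math]\displaystyle{ F(x) = \cos x - \cos\left(x^2\right) }[/math] directly.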
Derivatives to nth order
Some rules exist for computing the n-th derivative of functions, where n is a positive integer. These include:
Faà di Bruno's formula
If f and g are n-times differentiable, then [math]\displaystyle{ \frac{d^n}{d x^n} [f(g(x))]= n! \sum_{\{k_m\}} f^{(r)}(g(x)) \prod_{m=1}^n \frac{1}{k_m!} \left(\frac{g^{(m)}(x)}{m!} \right)^{k_m} }[/math] where [math]\displaystyle{ r = \sum_{m=1}^{n} k_m }[/math] and the set [math]\displaystyle{ \{k_m\} }[/math] consists of all non-negative integer solutions of the Diophantine equation [math]\displaystyle{ \sum_{m=1}^{n} m k_m = n }[/math].
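For example, for [math]\displaystyle{ n = 2 }[/math] the only solutions of [math]\displaystyle{ k_1 + 2k_2 = 2 }[/math] are [math]\displaystyle{ (k_1, k_2) = (2, 0) }[/math] and [math]\displaystyle{ (0, 1) }[/math], and the formula reduces to
- [math]\displaystyle{ \frac{d^2}{dx^2}[f(g(x))] = f''(g(x))\,g'(x)^2 + f'(g(x))\,g''(x). }[/math]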
General Leibniz rule
If f and g are n-times differentiable, then [math]\displaystyle{ \frac{d^n}{dx^n}[f(x)g(x)] = \sum_{k=0}^{n} \binom{n}{k} \frac{d^{n-k}}{d x^{n-k}} f(x) \frac{d^k}{d x^k} g(x) }[/math]
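For [math]\displaystyle{ n = 2 }[/math] this gives the familiar formula for the second derivative of a product: [math]\displaystyle{ \frac{d^2}{dx^2}[f(x)g(x)] = f''(x)g(x) + 2f'(x)g'(x) + f(x)g''(x). }[/math]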
See also
- Differentiable function – Mathematical function whose derivative exists
- Differential of a function – Notion in calculus
- Differentiation of integrals – Problem in mathematics
- Hyperbolic functions – Collective name of 6 mathematical functions
- Inverse trigonometric functions – Inverse functions of sin, cos, tan, etc.
- Lists of integrals
- List of mathematical functions
- Matrix calculus – Specialized notation for multivariable calculus
- Trigonometric functions – Functions of an angle
- Vector calculus identities – Mathematical identities
References
- ↑ Calculus (5th edition), F. Ayres, E. Mendelson, Schaum's Outline Series, 2009, ISBN:978-0-07-150861-2.
- ↑ Advanced Calculus (3rd edition), R. Wrede, M.R. Spiegel, Schaum's Outline Series, 2010, ISBN:978-0-07-162366-7.
- ↑ Complex Variables, M.R. Spiegel, S. Lipschutz, J.J. Schiller, D. Spellman, Schaum's Outlines Series, McGraw Hill (USA), 2009, ISBN:978-0-07-161569-3
- ↑ "Differentiation Rules". https://courseware.cemc.uwaterloo.ca/11/assignments/47/6.
Sources and further reading
These rules are given in many books on both elementary and advanced calculus, in pure and applied mathematics. Those in this article (in addition to the above references) can be found in:
- Mathematical Handbook of Formulas and Tables (3rd edition), S. Lipschutz, M.R. Spiegel, J. Liu, Schaum's Outline Series, 2009, ISBN:978-0-07-154855-7.
- The Cambridge Handbook of Physics Formulas, G. Woan, Cambridge University Press, 2010, ISBN:978-0-521-57507-2.
- Mathematical Methods for Physics and Engineering, K.F. Riley, M.P. Hobson, S.J. Bence, Cambridge University Press, 2010, ISBN:978-0-521-86153-3.
- NIST Handbook of Mathematical Functions, F. W. J. Olver, D. W. Lozier, R. F. Boisvert, C. W. Clark, Cambridge University Press, 2010, ISBN:978-0-521-19225-5.
External links
Original source: https://en.wikipedia.org/wiki/Differentiation_rules.