Indefinite sum

In discrete calculus the indefinite sum operator (also known as the antidifference operator), denoted by [math]\displaystyle{ \sum _x }[/math] or [math]\displaystyle{ \Delta^{-1} }[/math],[1][2] is the linear operator inverse to the forward difference operator [math]\displaystyle{ \Delta }[/math]. It relates to the forward difference operator as the indefinite integral relates to the derivative. Thus

[math]\displaystyle{ \Delta \sum_x f(x) = f(x) \, . }[/math]

More explicitly, if [math]\displaystyle{ \sum_x f(x) = F(x) }[/math], then

[math]\displaystyle{ F(x+1) - F(x) = f(x) \, . }[/math]

If F(x) is a solution of this functional equation for a given f(x), then so is F(x)+C(x) for any periodic function C(x) with period 1. Therefore, each indefinite sum actually represents a family of functions. However, by Carlson's theorem, the solution equal to its Newton series expansion is unique up to an additive constant C. This unique solution can be represented by the formal power series form of the antidifference operator: [math]\displaystyle{ \Delta^{-1}=\frac1{e^D-1} }[/math].
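
A quick numerical check illustrates both the defining relation and the period-1 ambiguity. The following is a minimal sketch with an assumed example (not taken from a reference): [math]\displaystyle{ F(x)=\frac{x(x-1)}{2} }[/math] is an indefinite sum of [math]\displaystyle{ f(x)=x }[/math], and so is [math]\displaystyle{ F(x)+\sin 2\pi x }[/math].

```python
import math

# Minimal sketch (assumed example): F(x) = x(x-1)/2 is an indefinite sum of
# f(x) = x, and adding any period-1 function, e.g. sin(2*pi*x), gives another.
f = lambda x: x
F = lambda x: x * (x - 1) / 2
G = lambda x: F(x) + math.sin(2 * math.pi * x)

for x in (0.3, 1.7, 4.25):
    # Both differences reproduce f(x) (up to floating-point rounding).
    print(F(x + 1) - F(x) - f(x), G(x + 1) - G(x) - f(x))
```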

Fundamental theorem of discrete calculus

Indefinite sums can be used to calculate definite sums with the formula:[3]

[math]\displaystyle{ \sum_{k=a}^b f(k)=\Delta^{-1}f(b+1)-\Delta^{-1}f(a) }[/math]
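
As a minimal sketch (assumed example), the antidifference [math]\displaystyle{ F(x)=\frac{x(x-1)}{2} }[/math] of [math]\displaystyle{ f(k)=k }[/math] reproduces the definite sum:

```python
# Sketch of the fundamental theorem of discrete calculus for f(k) = k,
# whose antidifference is F(x) = x(x-1)/2 (assumed example).
F = lambda x: x * (x - 1) // 2

a, b = 3, 10
direct = sum(range(a, b + 1))          # 3 + 4 + ... + 10 = 52
via_antidifference = F(b + 1) - F(a)   # F(11) - F(3) = 55 - 3 = 52
print(direct == via_antidifference)    # True
```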

Definitions

Laplace summation formula

[math]\displaystyle{ \sum _x f(x)=\int_0^x f(t) dt -\sum_{k=1}^\infty \frac{c_k\Delta^{k-1}f(x)}{k!} + C }[/math]
where [math]\displaystyle{ c_k=\int_0^1 \frac{\Gamma(x+1)}{\Gamma(x-k+1)}dx }[/math] are the Cauchy numbers of the first kind, also known as the Bernoulli numbers of the second kind.[4]
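
The coefficients can be generated directly from their integral definition. The sketch below (an assumed illustration using sympy, not part of the source) reproduces the first few values 1, 1/2, −1/6, 1/4, −19/30:

```python
import sympy as sp

x = sp.symbols('x')

def cauchy_number(k):
    """c_k = integral over [0, 1] of the falling factorial x(x-1)...(x-k+1)."""
    falling = sp.Integer(1)
    for j in range(k):
        falling *= (x - j)
    return sp.integrate(falling, (x, 0, 1))

print([cauchy_number(k) for k in range(5)])  # [1, 1/2, -1/6, 1/4, -19/30]
```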

Newton's formula

[math]\displaystyle{ \sum_x f(x)=\sum_{k=1}^\infty \binom{x}k \Delta^{k-1} [f]\left (0\right)+C=\sum_{k=1}^{\infty}\frac{\Delta^{k-1}[f](0)}{k!}(x)_k+C }[/math]
where [math]\displaystyle{ (x)_k=\frac{\Gamma(x+1)}{\Gamma(x-k+1)} }[/math] is the falling factorial.
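
For a polynomial f the series terminates, so the formula can be applied directly. The sketch below (assumed example) builds the indefinite sum of [math]\displaystyle{ f(x)=x^2 }[/math] from its forward differences at 0 and checks it against [math]\displaystyle{ \sum_{k=0}^4 k^2=30 }[/math] via the fundamental theorem above:

```python
from math import factorial

def forward_differences_at_zero(f, order):
    """Return [f(0), Δf(0), Δ²f(0), ..., Δ^order f(0)]."""
    values = [f(k) for k in range(order + 1)]
    diffs = []
    for _ in range(order + 1):
        diffs.append(values[0])
        values = [values[i + 1] - values[i] for i in range(len(values) - 1)]
    return diffs

def falling_factorial(x, k):
    result = 1.0
    for j in range(k):
        result *= x - j
    return result

def newton_indefinite_sum(f, x, degree):
    """Evaluate sum_x f at x; exact when f is a polynomial of the given degree."""
    d = forward_differences_at_zero(f, degree)
    return sum(d[k - 1] / factorial(k) * falling_factorial(x, k)
               for k in range(1, degree + 2))

f = lambda t: t ** 2
# Fundamental theorem: sum_{k=0}^{4} k^2 = F(5) - F(0) = 30
print(newton_indefinite_sum(f, 5, 2) - newton_indefinite_sum(f, 0, 2))  # 30.0
```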

Faulhaber's formula

[math]\displaystyle{ \sum _x f(x)= \sum_{n=1}^{\infty} \frac{f^{(n-1)} (0)}{n!} B_n(x) + C \, , }[/math]

provided that the right-hand side of the equation converges.
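
For example, for [math]\displaystyle{ f(x)=x^2 }[/math] only the [math]\displaystyle{ n=3 }[/math] term is nonzero, since [math]\displaystyle{ f(0)=f'(0)=0 }[/math] and [math]\displaystyle{ f''(0)=2 }[/math], giving

[math]\displaystyle{ \sum _x x^2 = \frac{B_3(x)}{3} + C \, , }[/math]

consistent with the entry for [math]\displaystyle{ x^a }[/math] in the list below.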

Mueller's formula

If [math]\displaystyle{ \lim_{x\to{+\infty}}f(x)=0, }[/math] then[5]

[math]\displaystyle{ \sum _x f(x)=\sum_{n=0}^\infty\left(f(n)-f(n+x)\right)+ C. }[/math]
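
As a hedged numeric sketch (assumed example, with the infinite series truncated), taking [math]\displaystyle{ f(x)=2^{-x} }[/math] gives a result that differs from the closed form [math]\displaystyle{ \frac{a^x}{a-1} }[/math] in the list below only by a constant:

```python
# Numeric sketch of Mueller's formula for f(x) = 2**(-x), which tends to 0
# as x -> +infinity; the series is truncated, which is an approximation.
def mueller_sum(f, x, terms=10000):
    return sum(f(n) - f(n + x) for n in range(terms))

f = lambda t: 2.0 ** (-t)
closed = lambda t: (0.5 ** t) / (0.5 - 1.0)   # a^x/(a-1) with a = 1/2

for x in (1.5, 3.5, 7.0):
    # The gap is the same constant C (here 2) for every x.
    print(mueller_sum(f, x) - closed(x))
```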

Euler–Maclaurin formula

[math]\displaystyle{ \sum _x f(x)= \int_0^x f(t) dt - \frac12 f(x)+\sum_{k=1}^{\infty}\frac{B_{2k}}{(2k)!}f^{(2k-1)}(x) + C }[/math]
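
For a function whose derivatives remain bounded the series can converge. The sketch below (assumed example, [math]\displaystyle{ f(x)=e^{-x} }[/math], series truncated) matches the closed form [math]\displaystyle{ \frac{a^x}{a-1} }[/math] with [math]\displaystyle{ a=e^{-1} }[/math] up to an additive constant:

```python
import math
from sympy import bernoulli

# Hedged numeric sketch of the Euler-Maclaurin form for f(x) = exp(-x)
# (assumed example; every odd derivative of f is -exp(-x)).
def em_antidifference(x, terms=20):
    integral = 1.0 - math.exp(-x)                       # integral of exp(-t) over [0, x]
    tail = sum(float(bernoulli(2 * k)) / math.factorial(2 * k) * (-math.exp(-x))
               for k in range(1, terms + 1))
    return integral - 0.5 * math.exp(-x) + tail

closed = lambda x: math.exp(-x) / (math.exp(-1) - 1)    # a^x/(a-1) with a = 1/e
for x in (1.0, 2.0, 5.0):
    print(em_antidifference(x) - closed(x))             # same constant (about 1) each time
```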

Choice of the constant term

Often the constant C in the indefinite sum is fixed by the following condition.

Let

[math]\displaystyle{ F(x)=\sum _x f(x)+C }[/math]

Then the constant C is fixed by the condition

[math]\displaystyle{ \int_0^1 F(x) \, dx=0 }[/math]

or

[math]\displaystyle{ \int_1^2 F(x) \, dx=0 }[/math]

Alternatively, Ramanujan's sum can be used:

[math]\displaystyle{ \sum_{x \ge 1}^{\Re}f(x)=-f(0)-F(0) }[/math]

or, using the second condition,

[math]\displaystyle{ \sum_{x \ge 1}^{\Re}f(x)=-F(1) }[/math]

respectively.[6][7]
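
For example, for [math]\displaystyle{ f(x)=x }[/math] one has [math]\displaystyle{ F(x)=\frac{x(x-1)}{2}+C }[/math]; the condition [math]\displaystyle{ \int_0^1 F(x)\,dx=0 }[/math] gives [math]\displaystyle{ C=\frac1{12} }[/math], and the first formula then yields the familiar value [math]\displaystyle{ \sum_{x \ge 1}^{\Re} x = -f(0)-F(0) = -\frac1{12} }[/math].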

Summation by parts

Main page: Summation by parts

Indefinite summation by parts:

[math]\displaystyle{ \sum_x f(x)\Delta g(x)=f(x)g(x)-\sum_x (g(x)+\Delta g(x)) \Delta f(x) }[/math]
[math]\displaystyle{ \sum_x f(x)\Delta g(x)+\sum_x g(x)\Delta f(x)=f(x)g(x)-\sum_x \Delta f(x)\Delta g(x) }[/math]

Definite summation by parts:

[math]\displaystyle{ \sum_{i=a}^b f(i)\Delta g(i)=f(b+1)g(b+1)-f(a)g(a)-\sum_{i=a}^b g(i+1)\Delta f(i) }[/math]
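
A small numeric check of the definite identity, with test sequences chosen purely for illustration (assumed example):

```python
# Verify the definite summation-by-parts identity for sample sequences.
def delta(h):
    return lambda i: h(i + 1) - h(i)

f = lambda i: i * i
g = lambda i: 2.0 ** i
a, b = 1, 6

lhs = sum(f(i) * delta(g)(i) for i in range(a, b + 1))
rhs = (f(b + 1) * g(b + 1) - f(a) * g(a)
       - sum(g(i + 1) * delta(f)(i) for i in range(a, b + 1)))
print(abs(lhs - rhs) < 1e-9)   # True
```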

Period rules

If [math]\displaystyle{ T }[/math] is a period of the function [math]\displaystyle{ f(x) }[/math], then

[math]\displaystyle{ \sum _x f(Tx)=x f(Tx) + C }[/math]

If [math]\displaystyle{ T }[/math] is an antiperiod of the function [math]\displaystyle{ f(x) }[/math], that is, [math]\displaystyle{ f(x+T)=-f(x) }[/math], then

[math]\displaystyle{ \sum _x f(Tx)=-\frac12 f(Tx) + C }[/math]

Alternative usage

Some authors use the phrase "indefinite sum" to describe a sum in which the numerical value of the upper limit is not given:

[math]\displaystyle{ \sum_{k=1}^n f(k). }[/math]

In this case a closed form expression F(k) for the sum is a solution of

[math]\displaystyle{ F(x+1) - F(x) = f(x+1) }[/math]

which is called the telescoping equation.[8] It is the inverse of the backward difference operator [math]\displaystyle{ \nabla }[/math]. It is related to the forward antidifference operator via the fundamental theorem of discrete calculus described earlier.
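
For example, [math]\displaystyle{ F(n)=\frac{n(n+1)}{2} }[/math] satisfies [math]\displaystyle{ F(x+1)-F(x)=x+1=f(x+1) }[/math] for [math]\displaystyle{ f(k)=k }[/math], recovering the closed form [math]\displaystyle{ \sum_{k=1}^n k=\frac{n(n+1)}{2} }[/math].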

List of indefinite sums

This is a list of indefinite sums of various functions. Not every function has an indefinite sum that can be expressed in terms of elementary functions.

Antidifferences of rational functions

[math]\displaystyle{ \sum _x a = ax + C }[/math]
[math]\displaystyle{ \sum _x x = \frac{x^2}{2}-\frac{x}{2} + C }[/math]
[math]\displaystyle{ \sum _x x^a = \frac{B_{a+1}(x)}{a+1} + C,\,a\notin \mathbb{Z}^- }[/math]
where [math]\displaystyle{ B_a(x)=-a\zeta(-a+1,x) }[/math] are the Bernoulli polynomials generalized to real order.
[math]\displaystyle{ \sum _x x^a = \frac{(-1)^{a-1}\psi^{(-a-1)}(x)}{\Gamma(-a)}+ C,\,a\in\mathbb{Z}^- }[/math]
where [math]\displaystyle{ \psi^{(n)}(x) }[/math] is the polygamma function.
[math]\displaystyle{ \sum _x \frac1x = \psi(x) + C }[/math]
where [math]\displaystyle{ \psi(x) }[/math] is the digamma function.
[math]\displaystyle{ \sum _x B_a(x)=(x-1)B_a(x)-\frac{a}{a+1} B_{a+1}(x)+C }[/math]

Antidifferences of exponential functions

[math]\displaystyle{ \sum _x a^x = \frac{a^x}{a-1} + C }[/math]

In particular,

[math]\displaystyle{ \sum _x 2^x = 2^x + C }[/math]

Antidifferences of logarithmic functions

[math]\displaystyle{ \sum _x \log_b x = \log_b \Gamma (x) + C }[/math]
[math]\displaystyle{ \sum _x \log_b ax = \log_b (a^{x-1}\Gamma (x)) + C }[/math]

Antidifferences of hyperbolic functions

[math]\displaystyle{ \sum _x \sinh ax = \frac{1}{2} \operatorname{csch} \left(\frac{a}{2}\right) \cosh \left(\frac{a}{2} - a x\right) + C }[/math]
[math]\displaystyle{ \sum _x \cosh ax = \frac{1}{2} \operatorname{csch} \left(\frac{a}{2}\right) \sinh \left(ax-\frac{a}{2}\right) + C }[/math]
[math]\displaystyle{ \sum _x \tanh ax = \frac1a \psi _{e^a}\left(x-\frac{i \pi }{2 a}\right)+\frac1a \psi _{e^a}\left(x+\frac{i \pi }{2 a}\right)-x + C }[/math]
where [math]\displaystyle{ \psi_q(x) }[/math] is the q-digamma function.

Antidifferences of trigonometric functions

[math]\displaystyle{ \sum _x \sin ax = -\frac{1}{2} \csc \left(\frac{a}{2}\right) \cos \left(\frac{a}{2}- ax \right) + C \,,\,\,a\ne 2n \pi }[/math]
[math]\displaystyle{ \sum _x \cos ax = \frac{1}{2} \csc \left(\frac{a}{2}\right) \sin \left(ax - \frac{a}{2}\right) + C \,,\,\,a\ne 2n \pi }[/math]
[math]\displaystyle{ \sum _x \sin^2 ax = \frac{x}{2} + \frac{1}{4} \csc (a) \sin (a-2ax) + C \, \,,\,\,a\ne n\pi }[/math]
[math]\displaystyle{ \sum _x \cos^2 ax = \frac{x}{2}-\frac{1}{4} \csc (a) \sin (a-2 a x) + C \,\,,\,\,a\ne n\pi }[/math]
[math]\displaystyle{ \sum_x \tan ax = i x-\frac1a \psi _{e^{2 i a}}\left(x-\frac{\pi }{2 a}\right) + C \,,\,\,a\ne \frac{n\pi}2 }[/math]
where [math]\displaystyle{ \psi_q(x) }[/math] is the q-digamma function.
[math]\displaystyle{ \sum_x \tan x=ix-\psi _{e^{2 i}}\left(x+\frac{\pi }{2}\right) + C = -\sum _{k=1}^{\infty } \left(\psi \left(k \pi -\frac{\pi }{2}+1-x\right)+\psi \left(k \pi -\frac{\pi }{2}+x\right)-\psi \left(k \pi -\frac{\pi }{2}+1\right)-\psi \left(k \pi -\frac{\pi }{2}\right)\right) + C }[/math]
[math]\displaystyle{ \sum_x \cot ax =-i x-\frac{i \psi _{e^{2 i a}}(x)}{a} + C \,,\,\,a\ne \frac{n\pi}2 }[/math]
[math]\displaystyle{ \sum_x \operatorname{sinc} x=\operatorname{sinc}(x-1)\left(\frac{1}{2}+(x-1)\left(\ln(2)+\frac{\psi (\frac{x-1}{2})+\psi (\frac{1-x}{2})}{2}-\frac{\psi (x-1)+\psi (1-x)}{2}\right)\right) + C }[/math]
where [math]\displaystyle{ \operatorname{sinc} (x) }[/math] is the normalized sinc function.

Antidifferences of inverse hyperbolic functions

[math]\displaystyle{ \sum_x \operatorname{artanh}\, a x =\frac{1}{2} \ln \left(\frac{\Gamma \left(x+\frac{1}{a}\right)}{\Gamma \left(x-\frac{1}{a}\right)}\right) + C }[/math]

Antidifferences of inverse trigonometric functions

[math]\displaystyle{ \sum_x \arctan a x = \frac{i}{2} \ln \left(\frac{\Gamma (x+\frac ia)}{ \Gamma (x-\frac ia)}\right)+C }[/math]

Antidifferences of special functions

[math]\displaystyle{ \sum _x \psi(x)=(x-1) \psi(x)-x+C }[/math]
[math]\displaystyle{ \sum _x \Gamma(x)=(-1)^{x+1}\Gamma(x)\frac{\Gamma(1-x,-1)}e+C }[/math]
where [math]\displaystyle{ \Gamma(s,x) }[/math] is the incomplete gamma function.
[math]\displaystyle{ \sum _x (x)_a = \frac{(x)_{a+1}}{a+1}+C }[/math]
where [math]\displaystyle{ (x)_a }[/math] is the falling factorial.
[math]\displaystyle{ \sum _x \operatorname{sexp}_a (x) = \ln_a \frac{(\operatorname{sexp}_a (x))'}{(\ln a)^x} + C }[/math]
(see super-exponential function)

See also

  • Indefinite product
  • Time scale calculus
  • List of derivatives and integrals in alternative calculi

References

  1. On Computing Closed Forms for Indefinite Summations. Yiu-Kwong Man. J. Symbolic Computation (1993), 16, 355–376.
  2. "If Y is a function whose first difference is the function y, then Y is called an indefinite sum of y and denoted Δ−1y" Introduction to Difference Equations, Samuel Goldberg
  3. "Handbook of discrete and combinatorial mathematics", Kenneth H. Rosen, John G. Michaels, CRC Press, 1999, ISBN:0-8493-0149-1
  4. Bernoulli numbers of the second kind on MathWorld.
  5. Markus Müller. How to Add a Non-Integer Number of Terms, and How to Produce Unusual Infinite Summations (note that he uses a slightly alternative definition of fractional sum in his work, i.e. inverse to backwards difference, hence 1 as the lower limit in his formula)
  6. Bruce C. Berndt, Ramanujan's Notebooks, Ramanujan's Theory of Divergent Series, Chapter 6, Springer-Verlag (ed.), (1939), pp. 133–149.
  7. Éric Delabaere, Ramanujan's Summation, Algorithms Seminar 2001–2002, F. Chyzak (ed.), INRIA, (2003), pp. 83–88.
  8. Algorithms for Nonlinear Higher Order Difference Equations, Manuel Kauers

Further reading