Eigenfunction

Figure: A solution of the vibrating drum problem is, at any point in time, an eigenfunction of the Laplace operator on a disk.

In mathematics, an eigenfunction of a linear operator D defined on some function space is any non-zero function [math]\displaystyle{ f }[/math] in that space that, when acted upon by D, is only multiplied by some scaling factor called an eigenvalue. As an equation, this condition can be written as [math]\displaystyle{ Df = \lambda f }[/math] for some scalar eigenvalue [math]\displaystyle{ \lambda. }[/math][1][2][3] The solutions to this equation may also be subject to boundary conditions that limit the allowable eigenvalues and eigenfunctions.

An eigenfunction is a type of eigenvector.

Eigenfunctions

In general, an eigenvector of a linear operator D defined on some vector space is a nonzero vector in the domain of D that, when D acts upon it, is simply scaled by some scalar value called an eigenvalue. In the special case where D is defined on a function space, the eigenvectors are referred to as eigenfunctions. That is, a function f is an eigenfunction of D if it satisfies the equation

[math]\displaystyle{ Df = \lambda f, }[/math]    (1)

where λ is a scalar.[1][2][3] The solutions to Equation (1) may also be subject to boundary conditions. Because of the boundary conditions, the possible values of λ are generally limited, for example to a discrete set λ1, λ2, … or to a continuous set over some range. The set of all possible eigenvalues of D is sometimes called its spectrum, which may be discrete, continuous, or a combination of both.[1]

Each value of λ corresponds to one or more eigenfunctions. If multiple linearly independent eigenfunctions have the same eigenvalue, the eigenvalue is said to be degenerate and the maximum number of linearly independent eigenfunctions associated with the same eigenvalue is the eigenvalue's degree of degeneracy or geometric multiplicity.[4][5]
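For instance, the second derivative operator [math]\displaystyle{ \frac{d^2}{dt^2} }[/math] has the eigenvalue λ = 1 with degeneracy two, because the linearly independent functions [math]\displaystyle{ e^{t} }[/math] and [math]\displaystyle{ e^{-t} }[/math] both satisfy [math]\displaystyle{ \frac{d^2}{dt^2}e^{\pm t} = e^{\pm t}. }[/math]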

Derivative example

A widely used class of linear operators acting on infinite-dimensional spaces consists of the differential operators on the space [math]\displaystyle{ C^\infty }[/math] of infinitely differentiable real or complex functions of a real or complex argument t. For example, consider the derivative operator [math]\displaystyle{ \frac{d}{dt} }[/math] with eigenvalue equation [math]\displaystyle{ \frac{d}{dt}f(t) = \lambda f(t). }[/math]

This differential equation can be solved by multiplying both sides by [math]\displaystyle{ \frac{dt}{f(t)} }[/math] and integrating. Its solution, the exponential function [math]\displaystyle{ f(t)=f_0 e^{\lambda t}, }[/math] is the eigenfunction of the derivative operator, where f0 is a parameter that depends on the boundary conditions. Note that in this case the eigenfunction is itself a function of its associated eigenvalue λ, which can take any real or complex value. In particular, note that for λ = 0 the eigenfunction f(t) is a constant.

Suppose in the example that f(t) is subject to the boundary conditions f(0) = 1 and [math]\displaystyle{ \left.\frac{df}{dt}\right|_{t=0} = 2 }[/math]. We then find that [math]\displaystyle{ f(t)=e^{2t}, }[/math] where λ = 2 is the only eigenvalue of the differential equation that also satisfies the boundary conditions.
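This example can be checked symbolically (a minimal sketch in Python with SymPy; the variable names lam and f0 are illustrative, not from the source): the exponential satisfies the eigenvalue equation for any λ, and the two boundary conditions single out f0 = 1 and λ = 2.

```python
import sympy as sp

t, lam, f0 = sp.symbols('t lambda f0')

# General solution of the eigenvalue equation df/dt = lambda * f
f = f0 * sp.exp(lam * t)

# The derivative operator returns the same function scaled by lambda
assert sp.simplify(sp.diff(f, t) - lam * f) == 0

# Impose the boundary conditions f(0) = 1 and f'(0) = 2
sol = sp.solve([f.subs(t, 0) - 1, sp.diff(f, t).subs(t, 0) - 2], [f0, lam], dict=True)
print(sol)  # [{f0: 1, lambda: 2}], i.e. f(t) = exp(2*t)
```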

Link to eigenvalues and eigenvectors of matrices

Eigenfunctions can be expressed as column vectors and linear operators can be expressed as matrices, although they may have infinite dimensions. As a result, many of the concepts related to eigenvectors of matrices carry over to the study of eigenfunctions.

Define the inner product on the function space on which D is defined as [math]\displaystyle{ \langle f,g \rangle = \int_{\Omega} f^*(t)g(t)\, dt, }[/math] where the integral is taken over some domain of interest for t, denoted Ω, and * denotes the complex conjugate.

Suppose the function space has an orthonormal basis given by the set of functions {u1(t), u2(t), …, un(t)}, where n may be infinite. For the orthonormal basis, [math]\displaystyle{ \langle u_i,u_j \rangle = \int_{\Omega} \ u_i^*(t)u_j(t) dt = \delta_{ij} = \begin{cases} 1 & i=j \\ 0 & i \ne j \end{cases}, }[/math] where δij is the Kronecker delta and can be thought of as the elements of the identity matrix.

Functions can be written as a linear combination of the basis functions, [math]\displaystyle{ f(t) = \sum_{j=1}^n b_j u_j(t), }[/math] for example through a Fourier expansion of f(t). The coefficients bj can be stacked into an n by 1 column vector b = [b1 b2bn]T. In some special cases, such as the coefficients of the Fourier series of a sinusoidal function, this column vector has finite dimension.

Additionally, define a matrix representation of the linear operator D with elements [math]\displaystyle{ A_{ij} = \langle u_i,Du_j \rangle = \int_{\Omega}\ u^*_i(t)Du_j(t) dt. }[/math]

We can write the function Df(t) either as a linear combination of the basis functions or as D acting upon the expansion of f(t), [math]\displaystyle{ Df(t) = \sum_{j=1}^n c_j u_j(t) = \sum_{j=1}^n b_j Du_j(t). }[/math]

Taking the inner product of each side of this equation with an arbitrary basis function ui(t), [math]\displaystyle{ \begin{align} \sum_{j=1}^n c_j \int_{\Omega} \ u_i^*(t)u_j(t) dt &= \sum_{j=1}^n b_j \int_{\Omega} \ u_i^*(t)Du_j(t) dt, \\ c_i &= \sum_{j=1}^n b_j A_{ij}. \end{align} }[/math]

This is the matrix multiplication Ab = c written in summation notation and is a matrix equivalent of the operator D acting upon the function f(t) expressed in the orthonormal basis. If f(t) is an eigenfunction of D with eigenvalue λ, then Ab = λb.
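The correspondence can be illustrated numerically (a sketch, not from the source; the choice of Ω = [0, 2π), the Fourier basis, and the operator D = d/dt are assumptions made for the example). In this basis the matrix A of the derivative operator is diagonal with entries ik, so each basis vector b = ek satisfies Ab = λb with λ = ik:

```python
import numpy as np

# Domain Omega = [0, 2*pi), sampled densely to approximate the integrals
N = 2048
t = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
dt = t[1] - t[0]

# Orthonormal Fourier basis u_k(t) = exp(i k t) / sqrt(2 pi), k = -3..3
ks = np.arange(-3, 4)
U = np.array([np.exp(1j * k * t) / np.sqrt(2 * np.pi) for k in ks])

# D = d/dt applied to each basis function (known analytically: D u_k = i k u_k)
DU = np.array([1j * k * u for k, u in zip(ks, U)])

# Matrix elements A_ij = <u_i, D u_j> = integral over Omega of conj(u_i) * D u_j
A = np.einsum('it,jt->ij', np.conj(U), DU) * dt

# A is diagonal with entries i*k, so A e_k = (i k) e_k for each basis vector e_k
print(np.allclose(A, np.diag(1j * ks)))  # True
```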

Eigenvalues and eigenfunctions of Hermitian operators

Many of the operators encountered in physics are Hermitian. Suppose the linear operator D acts on a function space that is a Hilbert space with an orthonormal basis given by the set of functions {u1(t), u2(t), …, un(t)}, where n may be infinite. In this basis, the operator D has a matrix representation A with elements [math]\displaystyle{ A_{ij} = \langle u_i,Du_j \rangle = \int_{\Omega} dt\ u^*_i(t)Du_j(t), }[/math] integrated over some domain of interest for t, denoted Ω.

By analogy with Hermitian matrices, D is a Hermitian operator if Aij = Aji*, or:[6] [math]\displaystyle{ \begin{align} \langle u_i,Du_j \rangle &= \langle Du_i,u_j \rangle, \\[-1pt] \int_{\Omega} dt\ u^*_i(t)Du_j(t) &= \int_{\Omega} dt\ u_j(t)[Du_i(t)]^*. \end{align} }[/math]

Consider the Hermitian operator D with eigenvalues λ1, λ2, … and corresponding eigenfunctions f1(t), f2(t), …. This Hermitian operator has the following properties:

  • Its eigenvalues are real, λi = λi*[4][6]
  • Its eigenfunctions obey an orthogonality condition, [math]\displaystyle{ \langle f_i,f_j \rangle = 0 }[/math] if i ≠ j[6][7][8]

The second condition always holds for λi ≠ λj. For degenerate eigenfunctions with the same eigenvalue λi, orthogonal eigenfunctions can always be chosen that span the eigenspace associated with λi, for example by using the Gram–Schmidt process.[5] Depending on whether the spectrum is discrete or continuous, the eigenfunctions can be normalized by setting the inner product of the eigenfunctions equal to either a Kronecker delta or a Dirac delta function, respectively.[8][9]

For many Hermitian operators, notably Sturm–Liouville operators, a third property is

  • Its eigenfunctions form a basis of the function space on which the operator is defined[5]

As a consequence, in many important cases, the eigenfunctions of the Hermitian operator form an orthonormal basis. In these cases, an arbitrary function can be expressed as a linear combination of the eigenfunctions of the Hermitian operator.
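These properties can be seen in a small numerical experiment (a sketch under assumed choices, none of which come from the source: D = −d²/dx² with Dirichlet boundary conditions on (0, 1), discretized by second-order central differences). The resulting matrix is real and symmetric, its eigenvalues are real and approximate (kπ)², and its eigenvectors are mutually orthogonal:

```python
import numpy as np

# Discretize D = -d^2/dx^2 on (0, 1) with Dirichlet boundary conditions,
# using second-order central differences on an interior grid of n points
n = 200
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n)) -
     np.diag(np.ones(n - 1), 1) -
     np.diag(np.ones(n - 1), -1)) / h**2   # real symmetric, hence Hermitian

# eigh is intended for Hermitian matrices and returns real eigenvalues
evals, evecs = np.linalg.eigh(A)

print(evals[:3])                                 # approximately [pi^2, 4 pi^2, 9 pi^2]
print(np.allclose(evecs.T @ evecs, np.eye(n)))   # eigenvectors are orthonormal: True
```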

Applications

Vibrating strings

The shape of a standing wave in a string fixed at its boundaries is an example of an eigenfunction of a differential operator. The admissible eigenvalues are governed by the length of the string and determine the frequency of oscillation.

Let h(x, t) denote the transverse displacement of a stressed elastic cord, such as the vibrating strings of a string instrument, as a function of the position x along the string and of time t. Applying the laws of mechanics to infinitesimal portions of the string, the function h satisfies the partial differential equation [math]\displaystyle{ \frac{\partial^2 h}{\partial t^2} = c^2\frac{\partial^2 h}{\partial x^2}, }[/math] which is called the (one-dimensional) wave equation. Here c is a constant speed that depends on the tension and mass of the string.

This problem is amenable to the method of separation of variables. If we assume that h(x, t) can be written as a product of the form X(x)T(t), we can form a pair of ordinary differential equations: [math]\displaystyle{ \frac{d^2}{dx^2}X=-\frac{\omega^2}{c^2}X, \qquad \frac{d^2}{dt^2}T = -\omega^2 T. }[/math]

Each of these is an eigenvalue equation with eigenvalues [math]\displaystyle{ -\frac{\omega^2}{c^2} }[/math] and [math]\displaystyle{ -\omega^2, }[/math] respectively. For any values of ω and c, the equations are satisfied by the functions [math]\displaystyle{ X(x) = \sin\left(\frac{\omega x}{c} + \varphi\right), \qquad T(t) = \sin(\omega t + \psi), }[/math] where the phase angles φ and ψ are arbitrary real constants.

If we impose boundary conditions, for example that the ends of the string are fixed at x = 0 and x = L, namely X(0) = X(L) = 0, and that T(0) = 0, we constrain the eigenvalues. For these boundary conditions, sin(φ) = 0 and sin(ψ) = 0, so the phase angles φ = ψ = 0, and [math]\displaystyle{ \sin\left(\frac{\omega L}{c}\right) = 0. }[/math]

This last boundary condition constrains ω to take a value ωn = ncπ/L, where n is any integer. Thus, the clamped string supports a family of standing waves of the form [math]\displaystyle{ h(x,t) = \sin\left(\frac{n\pi x}{L} \right) \sin(\omega_n t). }[/math]

In the example of a string instrument, ωn is the angular frequency of the n-th harmonic, which is called the (n − 1)-th overtone.
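The same allowed frequencies can be recovered numerically (a sketch, with L = 1, c = 1, and a central-difference discretization chosen for illustration; these choices are not from the source): the eigenvalues of the discretized d²/dx² with fixed ends are approximately −(nπ/L)², so ωn ≈ nπc/L.

```python
import numpy as np

# String of length L = 1 with wave speed c = 1 and fixed ends X(0) = X(L) = 0
L, c = 1.0, 1.0
n = 400                      # number of interior grid points
h = L / (n + 1)

# Central-difference matrix for d^2/dx^2 with Dirichlet boundary conditions
D2 = (np.diag(-2.0 * np.ones(n)) +
      np.diag(np.ones(n - 1), 1) +
      np.diag(np.ones(n - 1), -1)) / h**2

# Eigenvalues of d^2/dx^2 are -(omega/c)^2, so omega = c * sqrt(-eigenvalue)
omegas = np.sort(c * np.sqrt(-np.linalg.eigvalsh(D2)))

print(omegas[:4])                        # approximately [pi, 2 pi, 3 pi, 4 pi]
print(np.pi * np.arange(1, 5) * c / L)   # analytic omega_n = n pi c / L
```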

Schrödinger equation

In quantum mechanics, the Schrödinger equation [math]\displaystyle{ i \hbar \frac{\partial}{\partial t}\Psi(\mathbf{r},t) = H \Psi(\mathbf{r},t) }[/math] with the Hamiltonian operator [math]\displaystyle{ H = -\frac{\hbar^2}{2m}\nabla^2+ V(\mathbf{r},t) }[/math] can be solved by separation of variables if the Hamiltonian does not depend explicitly on time.[10] In that case, the wave function Ψ(r,t) = φ(r)T(t) leads to the two differential equations,

[math]\displaystyle{ H\varphi(\mathbf{r}) = E\varphi(\mathbf{r}), }[/math]    (2)

[math]\displaystyle{ i\hbar \frac{\partial T(t)}{\partial t} = ET(t). }[/math]    (3)

Both of these differential equations are eigenvalue equations with eigenvalue E. As shown in an earlier example, the solution of Equation (3) is the exponential [math]\displaystyle{ T(t) = e^{{-iEt}/{\hbar}}. }[/math]

Equation (2) is the time-independent Schrödinger equation. The eigenfunctions φk of the Hamiltonian operator are stationary states of the quantum mechanical system, each with a corresponding energy Ek. They represent allowable energy states of the system and may be constrained by boundary conditions.

The Hamiltonian operator H is an example of a Hermitian operator whose eigenfunctions form an orthonormal basis. When the Hamiltonian does not depend explicitly on time, general solutions of the Schrödinger equation are linear combinations of the stationary states multiplied by the oscillatory T(t),[11] [math]\displaystyle{ \Psi(\mathbf{r},t) = \sum_k c_k \varphi_k(\mathbf{r}) e^{{-iE_kt}/{\hbar}} }[/math] or, for a system with a continuous spectrum, [math]\displaystyle{ \Psi(\mathbf{r},t) = \int dE \, c_E \varphi_E(\mathbf{r}) e^{{-iEt}/{\hbar}}. }[/math]
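As a concrete illustration (a sketch for a particle in an infinite square well of width L = 1, in units where ħ = m = 1, with the Hamiltonian discretized by central differences; the potential, units, and grid are assumptions of the example, not from the source), the numerical eigenvalues of H approach the analytic energies En = n²π²ħ²/(2mL²), and an arbitrary initial state evolves as the sum of stationary states given above:

```python
import numpy as np

# Infinite square well of width L = 1, with hbar = m = 1 (V = 0 inside the well)
hbar, m, L = 1.0, 1.0, 1.0
n = 500
h = L / (n + 1)
x = np.linspace(h, L - h, n)

# H = -(hbar^2 / 2m) d^2/dx^2, discretized with Dirichlet boundary conditions
lap = (np.diag(-2.0 * np.ones(n)) +
       np.diag(np.ones(n - 1), 1) +
       np.diag(np.ones(n - 1), -1)) / h**2
H = -(hbar**2 / (2 * m)) * lap

E, phi = np.linalg.eigh(H)   # energies E_k and eigenfunctions phi_k (as columns)
print(E[:3])                 # approximately [pi^2/2, 2 pi^2, 9 pi^2/2]

# Expand an initial wave function in the stationary states and evolve it in time
psi0 = np.sin(np.pi * x / L) + 0.5 * np.sin(2 * np.pi * x / L)
c_k = phi.T @ psi0           # expansion coefficients c_k = <phi_k, psi0>
t = 0.7
psi_t = phi @ (c_k * np.exp(-1j * E * t / hbar))   # Psi(x, t) = sum_k c_k phi_k e^{-i E_k t / hbar}
print(np.allclose(np.vdot(psi_t, psi_t), np.vdot(psi0, psi0)))  # norm is conserved: True
```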

The success of the Schrödinger equation in explaining the spectral characteristics of hydrogen is considered one of the greatest triumphs of 20th-century physics.

Signals and systems

In the study of signals and systems, an eigenfunction of a system is a signal f(t) that, when input into the system, produces a response y(t) = λf(t), where λ is a complex scalar eigenvalue.[12]
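The standard example is the complex exponential: for a linear time-invariant (LTI) system, the input e^{jωt} (or e^{jωn} in discrete time) appears at the output multiplied by the system's frequency response evaluated at ω. A minimal discrete-time sketch in Python follows (the FIR filter taps below are an arbitrary choice for illustration, not from the source):

```python
import numpy as np

# A discrete-time LTI system: y[n] = sum_k h[k] x[n-k], here an arbitrary FIR filter
h = np.array([0.5, 0.3, 0.2])
omega = 0.4 * np.pi

# Complex exponential input x[n] = exp(j omega n)
n = np.arange(200)
x = np.exp(1j * omega * n)

# Output of the system; ignore the first few samples where the convolution warms up
y = np.convolve(x, h)[:len(n)]

# Eigenvalue lambda = H(e^{j omega}), the frequency response of the filter
lam = np.sum(h * np.exp(-1j * omega * np.arange(len(h))))

print(np.allclose(y[10:50], lam * x[10:50]))  # True: the exponential is only rescaled
```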
