Highly optimized tolerance


In applied mathematics, highly optimized tolerance (HOT) is a method of generating power law behavior in systems by including a global optimization principle. It was developed by Jean M. Carlson and John Doyle in the early 2000s.[1] For some systems that display a characteristic scale, adding a global optimization term can yield power law behavior. It has been used to generate and describe internet-like graphs and forest fire models, and may also apply to biological systems.

Example

The following is taken from Sornette's book.

Consider a random variable [math]\displaystyle{ X }[/math] that takes the value [math]\displaystyle{ x_i }[/math] with probability [math]\displaystyle{ p_i }[/math]. Furthermore, suppose that each [math]\displaystyle{ x_i }[/math] is determined by a second parameter [math]\displaystyle{ r_i }[/math] through

[math]\displaystyle{ x_i = r_i^{ - \beta } }[/math]

for some fixed [math]\displaystyle{ \beta \gt 0 }[/math]. We then want to minimize the expected value of [math]\displaystyle{ X }[/math],

[math]\displaystyle{ L = \sum_{i=0}^{N-1} p_i x_i }[/math]

subject to the constraint

[math]\displaystyle{ \sum_{i=0}^{N-1} r_i = \kappa }[/math]

Using Lagrange multipliers, the stationarity condition with respect to each [math]\displaystyle{ r_i }[/math] is

[math]\displaystyle{ \frac{\partial}{\partial r_i} \left[ \sum_{j=0}^{N-1} p_j r_j^{-\beta} + \lambda \left( \sum_{j=0}^{N-1} r_j - \kappa \right) \right] = -\beta p_i r_i^{-(\beta+1)} + \lambda = 0, }[/math]

so that [math]\displaystyle{ p_i \propto r_i^{\beta + 1} }[/math]. Substituting [math]\displaystyle{ r_i = x_i^{-1/\beta} }[/math] gives

[math]\displaystyle{ p_i \propto x_i^{ - ( 1 + 1/ \beta) }, }[/math]

a power law. The global optimization of minimizing [math]\displaystyle{ L }[/math], together with the power law dependence between [math]\displaystyle{ x_i }[/math] and [math]\displaystyle{ r_i }[/math], yields a power law distribution for the probabilities.
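
As a numerical sanity check (not part of Sornette's presentation), the sketch below solves the constrained minimization in closed form for an arbitrary set of probabilities, with illustrative choices of [math]\displaystyle{ \beta }[/math] and [math]\displaystyle{ \kappa }[/math], and verifies that the resulting allocation reproduces the exponent [math]\displaystyle{ -(1 + 1/\beta) }[/math].

```python
import numpy as np

# Sketch of the HOT toy optimization; beta, kappa and the probabilities
# p_i are arbitrary choices made purely for illustration.
rng = np.random.default_rng(0)

N = 1000          # number of events
beta = 1.5        # exponent in x_i = r_i**(-beta)
kappa = 10.0      # total resource budget, sum(r_i) = kappa

# Arbitrary (normalized) probabilities p_i.
p = rng.exponential(size=N)
p /= p.sum()

# Closed-form solution of the Lagrange condition:
# p_i proportional to r_i**(beta + 1)  =>  r_i proportional to p_i**(1/(beta + 1)),
# rescaled so that the constraint sum(r_i) = kappa holds.
r = p ** (1.0 / (beta + 1.0))
r *= kappa / r.sum()

# Event sizes implied by the allocation.
x = r ** (-beta)

# Check the predicted power law p_i ~ x_i**(-(1 + 1/beta)) by fitting
# the slope of log p against log x.
slope, _ = np.polyfit(np.log(x), np.log(p), 1)
print("fitted slope:   ", slope)
print("predicted slope:", -(1.0 + 1.0 / beta))

# For comparison, a uniform allocation (which also satisfies the
# constraint) should give a larger expected cost L = sum(p_i * x_i).
L_opt = np.sum(p * x)
r_uniform = np.full(N, kappa / N)
L_uniform = np.sum(p * r_uniform ** (-beta))
print("L (optimized):", L_opt, " L (uniform):", L_uniform)
```

Since the allocation comes directly from the stationarity condition, the fitted slope matches the predicted exponent to numerical precision; the uniform allocation is included only to confirm that the optimized allocation achieves a lower value of [math]\displaystyle{ L }[/math].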

References