Logical depth

Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett, based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the computation time of an algorithm of nearly minimal length, rather than the length of the minimal algorithm. Informally, the logical depth of a string [math]\displaystyle{ x }[/math] to a significance level [math]\displaystyle{ s }[/math] is the time required to compute [math]\displaystyle{ x }[/math] by a program fewer than [math]\displaystyle{ s }[/math] bits longer than the shortest program that computes [math]\displaystyle{ x }[/math].[1] For example, a purely random string is shallow: its near-minimal programs essentially print the string verbatim and halt quickly, so little computation separates the string from its shortest descriptions.

Formally, let [math]\displaystyle{ p^* }[/math] be the shortest program that computes a string [math]\displaystyle{ x }[/math] on some universal computer [math]\displaystyle{ U }[/math]. Then the logical depth of [math]\displaystyle{ x }[/math] to the significance level [math]\displaystyle{ s }[/math] is given by [math]\displaystyle{ \min\{T(p): (|p| - |p^*| \lt s) \wedge (U(p) = x)\}, }[/math] where [math]\displaystyle{ T(p) }[/math] is the number of computation steps that [math]\displaystyle{ p }[/math] takes on [math]\displaystyle{ U }[/math] to produce [math]\displaystyle{ x }[/math] and halt.
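
This quantity is not computable in general, since neither the shortest program nor the halting times of arbitrary programs can be found effectively. As a purely illustrative aid, the following Python sketch evaluates the definition over a bounded toy setting: programs is a hypothetical finite collection of candidate programs, and run is a hypothetical function that executes a program on a fixed machine with a step limit and returns its output and step count, or None if it does not halt within the limit. Both names are assumptions introduced here, not part of the definition.

    def logical_depth(x, programs, run, s):
        """Illustrative sketch only: assumes a finite candidate set `programs`
        and a toy run(p) -> (output, steps) or None with a built-in step bound;
        s is a positive integer. True logical depth over a universal computer
        is uncomputable."""
        # Keep only the candidates that halt (within the bound) and output x.
        results = [(p, run(p)) for p in programs]
        producers = [(p, res) for p, res in results if res is not None and res[0] == x]
        if not producers:
            return None  # no candidate program computes x

        # |p*|: length of the shortest candidate program that computes x.
        min_len = min(len(p) for p, _ in producers)

        # Depth at significance level s: the fastest running time among
        # programs with |p| - |p*| < s that output x.
        return min(steps for p, (_out, steps) in producers if len(p) - min_len < s)

For instance, with s = 1 only programs of exactly minimal length qualify, so the sketch returns the fastest running time among the shortest candidate programs for x.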

References

  1. Antunes, Luís; Bauwens, Bruno; Souto, André; Teixeira, Andreia (2017). "Sophistication vs Logical Depth". Theory of Computing Systems 60 (2): 280–298. doi:10.1007/s00224-016-9672-6.