Detrended fluctuation analysis

In stochastic processes, chaos theory and time series analysis, detrended fluctuation analysis (DFA) is a method for determining the statistical self-affinity of a signal. It is useful for analysing time series that appear to be long-memory processes (diverging correlation time, e.g. power-law decaying autocorrelation function) or 1/f noise. The obtained exponent is similar to the Hurst exponent, except that DFA may also be applied to signals whose underlying statistics (such as mean and variance) or dynamics are non-stationary (changing with time). It is related to measures based upon spectral techniques such as autocorrelation and Fourier transform.

Peng et al. introduced DFA in 1994 in a paper that has been cited over 3,000 times as of 2022.[1] DFA is an extension of ordinary fluctuation analysis (FA), which is affected by non-stationarities.

Definition

DFA on a Brownian motion process, with increasing values of [math]\displaystyle{ n }[/math].

Algorithm

Given: a time series [math]\displaystyle{ x_1, x_2, ..., x_N }[/math].

Compute its average value [math]\displaystyle{ \langle x\rangle = \frac 1N \sum_{t=1}^N x_t }[/math].

Sum it into a process [math]\displaystyle{ X_t=\sum_{i=1}^t (x_i-\langle x\rangle) }[/math]. This is the cumulative sum, or profile, of the original time series. For example, the profile of an i.i.d. white noise is a standard random walk.

Select a set [math]\displaystyle{ T = \{n_1, ..., n_k\} }[/math] of integers, such that [math]\displaystyle{ n_1 \lt n_2 \lt \cdots \lt n_k }[/math], the smallest [math]\displaystyle{ n_1 \approx 4 }[/math], the largest [math]\displaystyle{ n_k \approx N }[/math], and the sequence is roughly distributed evenly in log-scale: [math]\displaystyle{ \log(n_2) - \log(n_1) \approx \log(n_3) - \log(n_2) \approx \cdots }[/math]. In other words, it is approximately a geometric progression.[2]
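
For instance, such a set of window sizes can be generated with NumPy's geomspace; the bounds and the number of sizes below are illustrative only:

    import numpy as np

    N = 10_000  # length of the time series under analysis
    # ~20 window sizes from about 4 up to about N, evenly spaced in log-scale
    windows = np.unique(np.geomspace(4, N, num=20).astype(int))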

For each [math]\displaystyle{ n \in T }[/math], divide the sequence [math]\displaystyle{ X_t }[/math] into consecutive segments of length [math]\displaystyle{ n }[/math]. Within each segment, compute the least squares straight-line fit (the local trend). Let [math]\displaystyle{ Y_{1,n}, Y_{2,n}, ..., Y_{N,n} }[/math] be the resulting piecewise-linear fit.

Compute the root-mean-square deviation from the local trend (the local fluctuation) in each segment:

[math]\displaystyle{ F( n, i) = \sqrt{\frac{1}{n}\sum_{t = (i-1)n+1}^{in} \left( X_t - Y_{t, n} \right)^2}, \qquad i = 1, \dots, N/n. }[/math]

Their root-mean-square is the total fluctuation:

[math]\displaystyle{ F( n ) = \sqrt{\frac{1}{N/n}\sum_{i = 1}^{N/n} F(n, i)^2}. }[/math]

(If [math]\displaystyle{ N }[/math] is not divisible by [math]\displaystyle{ n }[/math], then one can either discard the remainder of the sequence, or repeat the procedure on the reversed sequence, then take their root-mean-square.[3])

Make the log-log plot [math]\displaystyle{ \log n - \log F(n) }[/math].[4][5]
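
A minimal NumPy sketch of the algorithm above; the function name dfa and its arguments are illustrative, not taken from the cited papers:

    import numpy as np

    def dfa(x, windows):
        """Return the total fluctuation F(n) for each window size n in `windows`."""
        x = np.asarray(x, dtype=float)
        N = len(x)
        # Profile: cumulative sum of the mean-removed series
        X = np.cumsum(x - x.mean())
        F = []
        for n in windows:
            # Split the profile into floor(N/n) consecutive segments of length n
            m = N // n
            segments = X[:m * n].reshape(m, n)
            # Least-squares linear fit (local trend) within each segment
            t = np.arange(n)
            slopes, intercepts = np.polyfit(t, segments.T, 1)
            trends = slopes[:, None] * t + intercepts[:, None]
            # Local fluctuation per segment, then RMS over segments
            F_ni = np.sqrt(np.mean((segments - trends) ** 2, axis=1))
            F.append(np.sqrt(np.mean(F_ni ** 2)))
        return np.array(F)

The slope of a straight-line fit to [math]\displaystyle{ \log F(n) }[/math] versus [math]\displaystyle{ \log n }[/math] then gives the exponent discussed below.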

Interpretation

A straight line of slope [math]\displaystyle{ \alpha }[/math] on the log-log plot indicates a statistical self-affinity of form [math]\displaystyle{ F(n) \propto n^{\alpha} }[/math]. Since [math]\displaystyle{ F(n) }[/math] monotonically increases with [math]\displaystyle{ n }[/math], we always have [math]\displaystyle{ \alpha \gt 0 }[/math].

The scaling exponent [math]\displaystyle{ \alpha }[/math] is a generalization of the Hurst exponent, with the precise value giving information about the series self-correlations:

  • [math]\displaystyle{ \alpha\lt 1/2 }[/math]: anti-correlated
  • [math]\displaystyle{ \alpha \simeq 1/2 }[/math]: uncorrelated, white noise
  • [math]\displaystyle{ \alpha\gt 1/2 }[/math]: correlated
  • [math]\displaystyle{ \alpha\simeq 1 }[/math]: 1/f-noise, pink noise
  • [math]\displaystyle{ \alpha\gt 1 }[/math]: non-stationary, unbounded
  • [math]\displaystyle{ \alpha\simeq 3/2 }[/math]: Brownian noise

Because the expected displacement in an uncorrelated random walk of length N grows like [math]\displaystyle{ \sqrt{N} }[/math], an exponent of [math]\displaystyle{ \tfrac{1}{2} }[/math] would correspond to uncorrelated white noise. When the exponent is between 0 and 1, the result is fractional Gaussian noise.
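
As a quick check of this interpretation, one can estimate [math]\displaystyle{ \alpha }[/math] for white noise and for its cumulative sum (an uncorrelated random walk); a sketch assuming the dfa helper defined above:

    import numpy as np

    rng = np.random.default_rng(0)
    white = rng.standard_normal(10_000)
    walk = np.cumsum(white)

    windows = np.unique(np.geomspace(4, 1_000, num=20).astype(int))
    for name, series in [("white noise", white), ("random walk", walk)]:
        F = dfa(series, windows)
        alpha = np.polyfit(np.log(windows), np.log(F), 1)[0]
        print(name, round(alpha, 2))  # expect roughly 0.5 and 1.5, respectively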

Pitfalls in interpretation

Though the DFA algorithm always produces a positive number [math]\displaystyle{ \alpha }[/math] for any time series, this does not necessarily imply that the time series is self-similar. Self-similarity requires the log-log graph to be sufficiently linear over a wide range of [math]\displaystyle{ n }[/math]. Furthermore, a combination of techniques including maximum likelihood estimation (MLE), rather than least squares, has been shown to better approximate the scaling (power-law) exponent.[6]

Also, there are many scaling exponent-like quantities that can be measured for a self-similar time series, including the divider dimension and Hurst exponent. Therefore, the DFA scaling exponent [math]\displaystyle{ \alpha }[/math] is not a fractal dimension, and does not have certain desirable properties that the Hausdorff dimension has, though in certain special cases it is related to the box-counting dimension for the graph of a time series.

Generalizations

Generalization to polynomial trends (higher order DFA)

The standard DFA algorithm given above removes a linear trend in each segment. If we remove a degree-n polynomial trend in each segment, it is called DFAn, or higher order DFA.[7]

Since [math]\displaystyle{ X_t }[/math] is a cumulative sum of [math]\displaystyle{ x_t-\langle x\rangle }[/math], a linear trend in [math]\displaystyle{ X_t }[/math] is a constant trend in [math]\displaystyle{ x_t-\langle x\rangle }[/math], which is a constant trend in [math]\displaystyle{ x_t }[/math] (visible as short sections of "flat plateaus"). In this regard, DFA1 removes the mean from segments of the time series [math]\displaystyle{ x_t }[/math] before quantifying the fluctuation.

Similarly, a degree n trend in [math]\displaystyle{ X_t }[/math] is a degree (n-1) trend in [math]\displaystyle{ x_t }[/math]. For example, DFA2 removes linear trends from segments of the time series [math]\displaystyle{ x_t }[/math] before quantifying the fluctuation, DFA3 removes parabolic trends from [math]\displaystyle{ x_t }[/math], and so on.

The Hurst R/S analysis removes constant trends from the original sequence and is thus, in its detrending, equivalent to DFA1.
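
A sketch of how the detrending step generalizes; the order argument is illustrative, and order=1 reproduces the linear detrending of DFA1 above:

    import numpy as np

    def dfa_n(x, windows, order=1):
        """DFAn: remove a degree-`order` polynomial trend within each segment."""
        X = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
        N = len(X)
        F = []
        for n in windows:
            m = N // n
            segments = X[:m * n].reshape(m, n)
            t = np.arange(n)
            # Fit and subtract a degree-`order` polynomial in each segment
            coeffs = np.polyfit(t, segments.T, order)  # shape (order + 1, m)
            trends = np.array([np.polyval(coeffs[:, j], t) for j in range(m)])
            F_ni = np.sqrt(np.mean((segments - trends) ** 2, axis=1))
            F.append(np.sqrt(np.mean(F_ni ** 2)))
        return np.array(F)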

Generalization to different moments (multifractal DFA)

DFA can be generalized by computing

[math]\displaystyle{ F_q( n ) = \left(\frac{1}{N/n}\sum_{i = 1}^{N/n} F(n, i)^q\right)^{1/q}, }[/math]

then making the log-log plot of [math]\displaystyle{ \log n - \log F_q(n) }[/math]. If there is strong linearity in that plot, its slope is [math]\displaystyle{ \alpha(q) }[/math].[8] DFA is the special case where [math]\displaystyle{ q=2 }[/math].

Multifractal systems scale as a function [math]\displaystyle{ F_q(n) \propto n^{\alpha(q)} }[/math]. Essentially, the scaling exponent need not be the same for all orders [math]\displaystyle{ q }[/math] of the fluctuation moments. In particular, standard DFA measures the scaling behavior of the second-moment fluctuations.

Kantelhardt et al. intended this scaling exponent as a generalization of the classical Hurst exponent. The classical Hurst exponent corresponds to [math]\displaystyle{ H=\alpha(2) }[/math] for stationary cases, and [math]\displaystyle{ H=\alpha(2)-1 }[/math] for nonstationary cases.[8][9][10]
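
A sketch of the q-th order fluctuation function defined above, reusing the segmentation and linear detrending of the basic algorithm; the case q = 0, which requires a logarithmic average, is deliberately omitted:

    import numpy as np

    def mfdfa(x, windows, qs=(-5, -3, -1, 1, 2, 3, 5)):
        """Return F_q(n) as an array of shape (len(qs), len(windows))."""
        X = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
        N = len(X)
        Fq = np.empty((len(qs), len(windows)))
        for k, n in enumerate(windows):
            m = N // n
            segments = X[:m * n].reshape(m, n)
            t = np.arange(n)
            slopes, intercepts = np.polyfit(t, segments.T, 1)
            trends = slopes[:, None] * t + intercepts[:, None]
            # Local fluctuations F(n, i), then their q-th order average
            F_ni = np.sqrt(np.mean((segments - trends) ** 2, axis=1))
            for j, q in enumerate(qs):
                Fq[j, k] = np.mean(F_ni ** q) ** (1.0 / q)
        return Fq

For each [math]\displaystyle{ q }[/math], the slope of [math]\displaystyle{ \log F_q(n) }[/math] against [math]\displaystyle{ \log n }[/math] estimates [math]\displaystyle{ \alpha(q) }[/math]; a monofractal signal yields approximately the same slope for every [math]\displaystyle{ q }[/math].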

Applications

The DFA method has been applied to many systems, e.g. DNA sequences,[11][12] neuronal oscillations,[10] speech pathology detection,[13] heartbeat fluctuation in different sleep stages,[14] and animal behavior pattern analysis.[15]

The effect of trends on DFA has been studied.[16]

Relations to other methods, for specific types of signal

For signals with power-law-decaying autocorrelation

In the case of power-law decaying autocorrelations, the correlation function decays with an exponent [math]\displaystyle{ \gamma }[/math]: [math]\displaystyle{ C(L)\sim L^{-\gamma}\!\ }[/math]. In addition, the power spectrum decays as [math]\displaystyle{ P(f)\sim f^{-\beta}\!\ }[/math]. The three exponents are related by:[11]

  • [math]\displaystyle{ \gamma=2-2\alpha }[/math]
  • [math]\displaystyle{ \beta=2\alpha-1 }[/math] and
  • [math]\displaystyle{ \gamma=1-\beta }[/math].

The relations can be derived using the Wiener–Khinchin theorem. The relation of DFA to the power spectrum method has been well studied.[17]

Thus, [math]\displaystyle{ \alpha }[/math] is tied to the slope of the power spectrum [math]\displaystyle{ \beta }[/math] and is used to describe the color of noise by this relationship: [math]\displaystyle{ \alpha = (\beta+1)/2 }[/math].
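
As a worked example of these relations: a signal with power-spectrum exponent [math]\displaystyle{ \beta = 0.6 }[/math] would have [math]\displaystyle{ \alpha = (0.6+1)/2 = 0.8 }[/math] and an autocorrelation-decay exponent [math]\displaystyle{ \gamma = 1 - 0.6 = 0.4 }[/math], consistent with [math]\displaystyle{ \gamma = 2 - 2\alpha }[/math].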

For fractional Gaussian noise

For fractional Gaussian noise (FGN), we have [math]\displaystyle{ \beta \in [-1,1] }[/math], and thus [math]\displaystyle{ \alpha \in [0,1] }[/math], and [math]\displaystyle{ \beta = 2H-1 }[/math], where [math]\displaystyle{ H }[/math] is the Hurst exponent. [math]\displaystyle{ \alpha }[/math] for FGN is equal to [math]\displaystyle{ H }[/math].[18]

For fractional Brownian motion

For fractional Brownian motion (FBM), we have [math]\displaystyle{ \beta \in [1,3] }[/math], and thus [math]\displaystyle{ \alpha \in [1,2] }[/math], and [math]\displaystyle{ \beta = 2H+1 }[/math], where [math]\displaystyle{ H }[/math] is the Hurst exponent. [math]\displaystyle{ \alpha }[/math] for FBM is equal to [math]\displaystyle{ H+1 }[/math].[9] In this context, FBM is the cumulative sum or the integral of FGN, thus, the exponents of their power spectra differ by 2.
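
For example, under these relations an FGN with Hurst exponent [math]\displaystyle{ H = 0.7 }[/math] has [math]\displaystyle{ \alpha = 0.7 }[/math] and [math]\displaystyle{ \beta = 2(0.7) - 1 = 0.4 }[/math], while the FBM obtained by cumulatively summing it has [math]\displaystyle{ \alpha = 1.7 }[/math] and [math]\displaystyle{ \beta = 2(0.7) + 1 = 2.4 }[/math], the two spectral exponents differing by 2 as stated.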

References

  1. Peng, C.K. (1994). "Mosaic organization of DNA nucleotides". Phys. Rev. E 49 (2): 1685–1689. doi:10.1103/physreve.49.1685. PMID 9961383. Bibcode1994PhRvE..49.1685P. 
  2. Hardstone, Richard; Poil, Simon-Shlomo; Schiavone, Giuseppina; Jansen, Rick; Nikulin, Vadim; Mansvelder, Huibert; Linkenkaer-Hansen, Klaus (2012). "Detrended Fluctuation Analysis: A Scale-Free View on Neuronal Oscillations". Frontiers in Physiology 3: 450. doi:10.3389/fphys.2012.00450. ISSN 1664-042X. PMID 23226132. 
  3. Zhou, Yu; Leung, Yee (2010-06-21). "Multifractal temporally weighted detrended fluctuation analysis and its application in the analysis of scaling behavior in temperature series". Journal of Statistical Mechanics: Theory and Experiment 2010 (6): P06021. doi:10.1088/1742-5468/2010/06/P06021. ISSN 1742-5468. https://iopscience.iop.org/article/10.1088/1742-5468/2010/06/P06021. 
  4. Peng, C.K. (1995). "Quantification of scaling exponents and crossover phenomena in nonstationary heartbeat time series". Chaos 5 (1): 82–87. doi:10.1063/1.166141. PMID 11538314. Bibcode1995Chaos...5...82P. 
  5. Bryce, R.M.; Sprague, K.B. (2012). "Revisiting detrended fluctuation analysis". Sci. Rep. 2: 315. doi:10.1038/srep00315. PMID 22419991. Bibcode2012NatSR...2E.315B. 
  6. Clauset, Aaron; Rohilla Shalizi, Cosma; Newman, M. E. J. (2009). "Power-Law Distributions in Empirical Data". SIAM Review 51 (4): 661–703. doi:10.1137/070710111. Bibcode2009SIAMR..51..661C. 
  7. Kantelhardt J.W. (2001). "Detecting long-range correlations with detrended fluctuation analysis". Physica A 295 (3–4): 441–454. doi:10.1016/s0378-4371(01)00144-3. Bibcode2001PhyA..295..441K. 
  8. 8.0 8.1 Kantelhardt, J.W.; Zschiegner, S.A.; Koscielny-Bunde, E.; Havlin, S.; Bunde, A.; Stanley, H.E. (2002). "Multifractal detrended fluctuation analysis of nonstationary time series". Physica A 316 (1–4): 87–114. doi:10.1016/s0378-4371(02)01383-3. Bibcode2002PhyA..316...87K. http://havlin.biu.ac.il/Publications.php?keyword=Multifractal+detrended+fluctuation+analysis+of+nonstationary+time+series++&year=*&match=all. Retrieved 2011-07-20. 
  9. 9.0 9.1 Movahed, M. Sadegh (2006). "Multifractal detrended fluctuation analysis of sunspot time series". Journal of Statistical Mechanics: Theory and Experiment 02. 
  10. 10.0 10.1 Hardstone, Richard; Poil, Simon-Shlomo; Schiavone, Giuseppina; Jansen, Rick; Nikulin, Vadim V.; Mansvelder, Huibert D.; Linkenkaer-Hansen, Klaus (1 January 2012). "Detrended Fluctuation Analysis: A Scale-Free View on Neuronal Oscillations". Frontiers in Physiology 3: 450. doi:10.3389/fphys.2012.00450. PMID 23226132. 
  11. 11.0 11.1 Buldyrev (1995). "Long-Range Correlation-Properties of Coding And Noncoding Dna-Sequences- Genbank Analysis". Phys. Rev. E 51 (5): 5084–5091. doi:10.1103/physreve.51.5084. PMID 9963221. Bibcode1995PhRvE..51.5084B. 
  12. Bunde A, Havlin S (1996). Fractals and Disordered Systems, Springer, Berlin, Heidelberg, New York. 
  13. Little, M.; McSharry, P.; Moroz, I.; Roberts, S. (2006). "Nonlinear, Biophysically-Informed Speech Pathology Detection". 2006 IEEE International Conference on Acoustics, Speech and Signal Processing Proceedings. 2. pp. II-1080–II-1083. doi:10.1109/ICASSP.2006.1660534. ISBN 1-4244-0469-X. http://www.robots.ox.ac.uk/~sjrob/Pubs/NonlinearBiophysicalVoiceDisorderDetection.pdf. 
  14. Bunde, A. (2000). "Correlated and uncorrelated regions in heart-rate fluctuations during sleep". Phys. Rev. Lett. 85 (17): 3736–3739. doi:10.1103/physrevlett.85.3736. PMID 11030994. Bibcode2000PhRvL..85.3736B. 
  15. Bogachev, Mikhail I.; Lyanova, Asya I.; Sinitca, Aleksandr M.; Pyko, Svetlana A.; Pyko, Nikita S.; Kuzmenko, Alexander V.; Romanov, Sergey A.; Brikova, Olga I. et al. (March 2023). "Understanding the complex interplay of persistent and antipersistent regimes in animal movement trajectories as a prominent characteristic of their behavioral pattern profiles: Towards an automated and robust model based quantification of anxiety test data" (in en). Biomedical Signal Processing and Control 81: 104409. doi:10.1016/j.bspc.2022.104409. https://linkinghub.elsevier.com/retrieve/pii/S1746809422008631. 
  16. Hu, K. (2001). "Effect of trends on detrended fluctuation analysis". Phys. Rev. E 64 (1): 011114. doi:10.1103/physreve.64.011114. PMID 11461232. Bibcode2001PhRvE..64a1114H. 
  17. Heneghan (2000). "Establishing the relation between detrended fluctuation analysis and power spectral density analysis for stochastic processes". Phys. Rev. E 62 (5): 6103–6110. doi:10.1103/physreve.62.6103. PMID 11101940. Bibcode2000PhRvE..62.6103H. 
  18. Taqqu, Murad S. (1995). "Estimators for long-range dependence: an empirical study.". Fractals 3 (4): 785–798. doi:10.1142/S0218348X95000692. 
