antropy.app_entropy
- antropy.app_entropy(x, order=2, metric='chebyshev')
Approximate Entropy.
- Parameters
- x : list or np.array
One-dimensional time series of shape (n_times).
- order : int
Embedding dimension. Default is 2.
- metric : str
Name of the distance metric used with sklearn.neighbors.KDTree. Default is to use the Chebyshev distance (see the snippet after the Returns block for how to list the accepted metric names).
- Returns
- ae : float
Approximate Entropy.
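Because metric is forwarded to sklearn.neighbors.KDTree, the accepted metric names can be listed from scikit-learn itself (a hedged illustration; the exact set depends on the installed scikit-learn version):
>>> from sklearn.neighbors import KDTree
>>> print(KDTree.valid_metrics)  # includes e.g. 'chebyshev', 'euclidean', 'manhattan'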
Notes
Approximate entropy is a technique used to quantify the amount of regularity and the unpredictability of fluctuations in time-series data. Smaller values indicate that the data is more regular and predictable.
The tolerance value (\(r\)) is set to \(0.2 * \text{std}(x)\).
Code adapted from the mne-features package by Jean-Baptiste Schiratti and Alexandre Gramfort.
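For readers who want to see the definition spelled out, below is a minimal brute-force sketch of the same quantity in NumPy. This is an illustration, not antropy's KDTree-based implementation, and the function name approx_entropy is ours. It embeds the signal in dimension order, counts for each template vector the fraction of vectors within Chebyshev distance \(r = 0.2 * \text{std}(x)\) (self-matches included), and returns the difference of the mean log-fractions at dimensions order and order + 1.

import numpy as np

def approx_entropy(x, order=2, r=None):
    """Brute-force approximate entropy (illustrative sketch, O(n^2) memory).

    Suitable for short series only; antropy.app_entropy uses a KDTree instead.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if r is None:
        r = 0.2 * np.std(x)  # tolerance, as stated in the Notes above

    def phi(m):
        # Embedding vectors of length m: (x[i], x[i+1], ..., x[i+m-1])
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-norm) distance between every pair of vectors
        dist = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=-1)
        # Fraction of vectors within tolerance r of each template,
        # self-matches included (Pincus' original definition)
        frac = (dist <= r).mean(axis=1)
        return np.log(frac).mean()

    return phi(order) - phi(order + 1)

On short series this should track antropy.app_entropy(x, order=order) closely, since both use the same tolerance and embedding length; the library version scales better because neighbor counting is done with a KDTree rather than a full pairwise distance matrix.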
References
Richman, J. S., & Moorman, J. R. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology, 278(6), H2039-H2049.
https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.DistanceMetric.html
Examples
Fractional Gaussian noise with H = 0.5
>>> import numpy as np
>>> import antropy as ant
>>> import stochastic.processes.noise as sn
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.5, rng=rng).sample(10000)
>>> print(f"{ant.app_entropy(x, order=2):.4f}")
2.1958
Same with order = 3 and metric = ‘euclidean’
>>> print(f"{ant.app_entropy(x, order=3, metric='euclidean'):.4f}")
1.5120
Fractional Gaussian noise with H = 0.9
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.9, rng=rng).sample(10000)
>>> print(f"{ant.app_entropy(x):.4f}")
1.9681
Fractional Gaussian noise with H = 0.1
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.1, rng=rng).sample(10000)
>>> print(f"{ant.app_entropy(x):.4f}")
2.0906
Random
>>> rng = np.random.default_rng(seed=42)
>>> print(f"{ant.app_entropy(rng.random(1000)):.4f}")
1.8177
Pure sine wave
>>> x = np.sin(2 * np.pi * 1 * np.arange(3000) / 100)
>>> print(f"{ant.app_entropy(x):.4f}")
0.2009
Linearly-increasing time-series
>>> x = np.arange(1000)
>>> print(f"{ant.app_entropy(x):.4f}")
-0.0010