antropy.sample_entropy
- antropy.sample_entropy(x, order=2, metric='chebyshev')
Sample Entropy.
- Parameters
- x : list or np.array
One-dimensional time series of shape (n_times).
- order : int
Embedding dimension. Default is 2.
- metric : str
Name of the distance metric function used with sklearn.neighbors.KDTree. Default is to use the Chebyshev distance.
- Returns
- se : float
Sample Entropy.
Notes
Sample entropy is a modification of approximate entropy, used for assessing the complexity of physiological time-series signals. It has two advantages over approximate entropy: data length independence and a relatively trouble-free implementation. Large values indicate high complexity, whereas smaller values characterize more self-similar and regular signals.
The sample entropy of a signal \(x\) is defined as:
\[H(x, m, r) = -\log\frac{C(m + 1, r)}{C(m, r)}\]
where \(m\) is the embedding dimension (= order), \(r\) is the radius of the neighbourhood (default = \(0.2 * \text{std}(x)\)), \(C(m + 1, r)\) is the number of embedded vectors of length \(m + 1\) having a Chebyshev distance less than \(r\), and \(C(m, r)\) is the number of embedded vectors of length \(m\) having a Chebyshev distance less than \(r\).
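As an illustration of this definition only, here is a minimal brute-force sketch in NumPy. It is not antropy's optimized implementation: the function name is hypothetical, and the choice of template vectors and the strict-inequality neighbour count are assumptions, so the result may differ marginally from ant.sample_entropy.

import numpy as np

def sampen_bruteforce(x, order=2, r=None):
    # Hypothetical helper, not part of antropy: follows the formula above literally.
    x = np.asarray(x, dtype=np.float64)
    if r is None:
        r = 0.2 * x.std()  # default neighbourhood radius, as in the docstring
    n = x.size

    def count(m):
        # n - order template vectors of length m (delay = 1), so C(m, r) and
        # C(m + 1, r) are computed from the same number of templates.
        emb = np.array([x[i:i + m] for i in range(n - order)])
        # Pairwise Chebyshev distance between all embedded vectors.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        # Count ordered pairs i != j with distance below r (drop self-matches).
        return np.sum(dist < r) - emb.shape[0]

    return -np.log(count(order + 1) / count(order))

Because the full pairwise distance matrix is built explicitly, this sketch is only practical for short signals.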
Note that if metric == 'chebyshev' and len(x) < 5000 points, then the sample entropy is computed using a fast custom Numba script. For other distance metrics or longer time series, the sample entropy is computed using code from the mne-features package by Jean-Baptiste Schiratti and Alexandre Gramfort (requires scikit-learn).
References
Richman, J. S. et al. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology, 278(6), H2039-H2049.
https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.DistanceMetric.html
Examples
Fractional Gaussian noise with H = 0.5
>>> import numpy as np
>>> import antropy as ant
>>> import stochastic.processes.noise as sn
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.5, rng=rng).sample(10000)
>>> print(f"{ant.sample_entropy(x, order=2):.4f}")
2.1819
Same with order = 3 and using the Euclidean distance
>>> print(f"{ant.sample_entropy(x, order=3, metric='euclidean'):.4f}") 2.6806
Fractional Gaussian noise with H = 0.9
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.9, rng=rng).sample(10000)
>>> print(f"{ant.sample_entropy(x):.4f}")
1.9078
Fractional Gaussian noise with H = 0.1
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.1, rng=rng).sample(10000)
>>> print(f"{ant.sample_entropy(x):.4f}")
2.0555
Random
>>> rng = np.random.default_rng(seed=42) >>> print(f"{ant.sample_entropy(rng.random(1000)):.4f}") 2.2017
Pure sine wave
>>> x = np.sin(2 * np.pi * 1 * np.arange(3000) / 100)
>>> print(f"{ant.sample_entropy(x):.4f}")
0.1633
Linearly-increasing time-series
>>> x = np.arange(1000)
>>> print(f"{ant.sample_entropy(x):.4f}")
-0.0000