entropy.perm_entropy(x, order=3, delay=1, normalize=False)

Permutation Entropy.
- Parameters
- x : list or np.array
One-dimensional time series of shape (n_times).
- order : int
Order of permutation entropy. Default is 3.
- delay : int
Time delay (lag). Default is 1.
- normalize : bool
If True, divide by log2(order!) to normalize the entropy between 0 and 1. Otherwise, return the permutation entropy in bits.
- Returns
- pe : float
Permutation Entropy.
Notes
The permutation entropy is a complexity measure for time series, first introduced by Bandt and Pompe in 2002.
The permutation entropy of a signal \(x\) is defined as:
\[H = -\sum p(\pi)\log_2 p(\pi)\]
where the sum runs over all \(n!\) permutations \(\pi\) of order \(n\). This is the information contained in comparing \(n\) consecutive values of the time series. It is clear that \(0 \le H(n) \le \log_2(n!)\), where the lower bound is attained for an increasing or decreasing sequence of values, and the upper bound for a completely random system where all \(n!\) possible permutations appear with the same probability.
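As a concrete illustration, take the short series \([4, 7, 9, 10, 6, 11, 3]\) used in the Examples below with order \(n = 2\): it contains six consecutive pairs, four ascending and two descending, so
\[H = -\tfrac{4}{6}\log_2\tfrac{4}{6} - \tfrac{2}{6}\log_2\tfrac{2}{6} \approx 0.9183 \text{ bits},\]
which matches the first doctest.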
The embedded matrix \(Y\) is created by:
\[y(i)=[x_i, x_{i+\text{delay}}, ..., x_{i+(\text{order}-1) * \text{delay}}]\]
\[Y=[y(1), y(2), ..., y(N-(\text{order}-1) * \text{delay})]^T\]
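The two steps above (delay embedding, then Shannon entropy of the ordinal-pattern frequencies) can be sketched in a few lines of NumPy. This is only an illustration of the definition, not the library's actual implementation; perm_entropy_sketch is a hypothetical helper name, and ties are simply broken by argsort order.

import math
import numpy as np

def perm_entropy_sketch(x, order=3, delay=1, normalize=False):
    # Illustrative sketch only -- not the library's perm_entropy.
    x = np.asarray(x, dtype=float)
    n_vectors = len(x) - (order - 1) * delay
    # Embedded matrix Y: row i holds `order` values spaced `delay` samples apart.
    Y = np.array([x[i : i + (order - 1) * delay + 1 : delay] for i in range(n_vectors)])
    # Ordinal pattern of each row (the permutation that sorts it).
    patterns = np.argsort(Y, axis=1)
    # Relative frequency p(pi) of each observed pattern.
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    if normalize:
        pe /= np.log2(math.factorial(order))
    return pe

Applied to the toy series from the Examples with order=2, this sketch reproduces the 0.9183 bits worked out above.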
References

Bandt, Christoph, and Bernd Pompe. “Permutation entropy: a natural complexity measure for time series.” Physical Review Letters 88.17 (2002): 174102.
Examples
Permutation entropy with order 2
>>> import numpy as np
>>> import entropy as ent
>>> import stochastic.processes.noise as sn
>>> x = [4, 7, 9, 10, 6, 11, 3]
>>> # Return a value in bits between 0 and log2(factorial(order))
>>> print(f"{ent.perm_entropy(x, order=2):.4f}")
0.9183
Normalized permutation entropy with order 3
>>> # Return a value between 0 and 1.
>>> print(f"{ent.perm_entropy(x, normalize=True):.4f}")
0.5888
Fractional Gaussian noise with H = 0.5
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.5, rng=rng).sample(10000)
>>> print(f"{ent.perm_entropy(x, normalize=True):.4f}")
0.9998
Fractional Gaussian noise with H = 0.9
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.9, rng=rng).sample(10000)
>>> print(f"{ent.perm_entropy(x, normalize=True):.4f}")
0.9926
Fractional Gaussian noise with H = 0.1
>>> rng = np.random.default_rng(seed=42)
>>> x = sn.FractionalGaussianNoise(hurst=0.1, rng=rng).sample(10000)
>>> print(f"{ent.perm_entropy(x, normalize=True):.4f}")
0.9959
Random
>>> rng = np.random.default_rng(seed=42)
>>> print(f"{ent.perm_entropy(rng.random(1000), normalize=True):.4f}")
0.9997
Pure sine wave
>>> x = np.sin(2 * np.pi * 1 * np.arange(3000) / 100)
>>> print(f"{ent.perm_entropy(x, normalize=True):.4f}")
0.4463
Linearly increasing time series
>>> x = np.arange(1000)
>>> print(f"{ent.perm_entropy(x, normalize=True):.4f}")
-0.0000
A strictly increasing series contains only the ascending ordinal pattern, so the entropy is exactly zero (the lower bound); the minus sign in -0.0000 is a floating-point negative zero produced by the formatting, not a negative entropy.
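Effect of the delay parameter (hand-worked, not captured library output)
None of the doctests above vary delay. With order=3 and delay=2, the 7-sample toy series leaves only three embedding vectors ([4, 9, 6], [7, 10, 11], [9, 6, 3]), each with a distinct ordinal pattern, so by the definition above the normalized entropy should equal log2(3)/log2(3!) ≈ 0.6131:
>>> x = [4, 7, 9, 10, 6, 11, 3]
>>> # Expected from the definition: 3 distinct patterns among 3 vectors
>>> print(f"{ent.perm_entropy(x, order=3, delay=2, normalize=True):.4f}")
0.6131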