AntroPy is a Python package for computing entropy and fractal-dimension measures of time series. It is designed for speed (via Numba just-in-time compilation) and ease of use, and works on both 1-D and N-D arrays. Typical use cases include feature extraction from physiological signals (e.g. EEG, ECG, EMG) and signal-processing research.
Installation
AntroPy requires Python 3.10+ and depends on NumPy (≥ 1.22.4), SciPy (≥ 1.8.0), scikit-learn (≥ 1.2.0), and Numba (≥ 0.57).
# pip
pip install antropy
# uv
uv pip install antropy
# conda
conda install -c conda-forge antropy
Development installation
git clone https://github.com/raphaelvallat/antropy.git
cd antropy
uv pip install --group=test --editable .
pytest --verbose
Functions

Entropy

| Function | Description |
|---|---|
| `perm_entropy` | Permutation entropy — captures ordinal patterns in the signal. |
| `spectral_entropy` | Spectral (power-spectrum) entropy via FFT or Welch method. |
| `svd_entropy` | Singular value decomposition entropy of the time-delay embedding matrix. |
| `app_entropy` | Approximate entropy (ApEn) — regularity measure sensitive to the length of the signal. |
| `sample_entropy` | Sample entropy (SampEn) — less biased alternative to ApEn. |
| `lziv_complexity` | Lempel-Ziv complexity for symbolic / binary sequences. |
| `num_zerocross` | Number of zero-crossings. |
| `hjorth_params` | Hjorth mobility and complexity parameters. |
Fractal dimension

| Function | Description |
|---|---|
| `petrosian_fd` | Petrosian fractal dimension. |
| `katz_fd` | Katz fractal dimension. |
| `higuchi_fd` | Higuchi fractal dimension — slope of log curve-length vs log interval. |
| `detrended_fluctuation` | Detrended fluctuation analysis (DFA) — estimates the Hurst / scaling exponent. |
Quick start

Entropy measures
import numpy as np
import antropy as ant
np.random.seed(1234567)
x = np.random.normal(size=3000)
print(ant.perm_entropy(x, normalize=True))
print(ant.spectral_entropy(x, sf=100, method='welch', normalize=True))
print(ant.svd_entropy(x, normalize=True))
print(ant.app_entropy(x))
print(ant.sample_entropy(x))
print(ant.hjorth_params(x)) # mobility in samples⁻¹
print(ant.hjorth_params(x, sf=100)) # mobility in Hz
print(ant.num_zerocross(x))
print(ant.lziv_complexity('01111000011001', normalize=True))
0.9995 # perm_entropy (0 = regular, 1 = random)
0.9941 # spectral_entropy (0 = pure tone, 1 = white noise)
0.9999 # svd_entropy
2.0152 # app_entropy
2.1986 # sample_entropy
(1.4313, 1.2153) # hjorth (mobility, complexity)
(143.1339, 1.2153) # hjorth with sf=100 Hz
1531 # num_zerocross
1.3598 # lziv_complexity (normalized)
Fractal dimension
print(ant.petrosian_fd(x))
print(ant.katz_fd(x))
print(ant.higuchi_fd(x))
print(ant.detrended_fluctuation(x))
1.0311 # petrosian_fd
5.9543 # katz_fd
2.0037 # higuchi_fd (≈ 2 for white noise)
0.4790 # DFA alpha (≈ 0.5 for white noise)
N-D arrays
Some functions accept N-D arrays and an axis argument, making it easy to process
multi-channel data in a single call:
import numpy as np
import antropy as ant
# 4 channels × 3000 samples
X = np.random.normal(size=(4, 3000))
pe = ant.perm_entropy(X, normalize=True, axis=-1) # shape (4,)
mob, com = ant.hjorth_params(X, sf=256, axis=-1) # shape (4,) each
nzc = ant.num_zerocross(X, normalize=True, axis=-1) # shape (4,)
se = ant.spectral_entropy(X, sf=256, normalize=True) # shape (4,)
Performance

Benchmarks on a MacBook Pro M1 Max (2021):

| Function | 1,000 samples | 10,000 samples | Complexity |
|---|---|---|---|
| `perm_entropy` | 24 µs | 87 µs | O(n) ¹ |
| `spectral_entropy` | 141 µs | 863 µs | O(n log n) ⁴ |
| `svd_entropy` | 35 µs | 140 µs | O(n·m²) ² |
| `app_entropy` | 1.5 ms | 45.9 ms | O(n²) worst ⁵ |
| `sample_entropy` | 917 µs | 46.0 ms | O(n²) worst ⁵ |
| `lziv_complexity` | 241 µs | 25.2 ms | O(n²/log n) |
| `num_zerocross` | 2.5 µs | 6 µs | O(n) |
| `hjorth_params` | 19 µs | 44 µs | O(n) |
| `petrosian_fd` | 6 µs | 14 µs | O(n) |
| `katz_fd` | 9 µs | 22 µs | O(n) |
| `higuchi_fd` | 7 µs | 92 µs | O(n·kmax) ³ |
| `detrended_fluctuation` | 99 µs | 1.4 ms | O(n log n) |
¹ perm_entropy: O(n) for order ∈ {3, 4} (default), O(n·m·log m) for order > 4.
² svd_entropy: m = order (default 3).
³ higuchi_fd: kmax = max interval (default 10).
⁴ spectral_entropy: O(n log n) for FFT method, O(n) for Welch with fixed nperseg (default).
⁵ app_entropy / sample_entropy: O(n²) worst case, empirically ~O(n^1.5) via KDTree average case.
Numba functions (sample_entropy, higuchi_fd, detrended_fluctuation) incur a one-time compilation cost on the first call.
Development
AntroPy was created and is maintained by Raphael Vallat. Contributions are welcome — feel free to open an issue or submit a pull request on GitHub.
Note: this program is provided with NO WARRANTY OF ANY KIND. Always validate results against known references.
Acknowledgements
Several functions in AntroPy were adapted from:
MNE-features — Jean-Baptiste Schiratti & Alexandre Gramfort
pyEntropy — Nikolay Donets
pyrem — Quentin Geissmann
nolds — Christopher Schölzel