Estimate cross-spectral density from an array using a multitaper method.
Parameters
----------
X : array-like, shape (n_epochs, n_channels, n_times)
    The time series data consisting of n_epochs separate observations of
    signals with n_channels time series of length n_times.
sfreq : float
    Sampling frequency of the observations.
t0 : float
    Time of the first sample relative to the onset of the epoch, in
    seconds. Defaults to 0.
fmin : float
    Minimum frequency of interest, in Hertz.
fmax : float
    Maximum frequency of interest, in Hertz.
tmin : float | None
    Minimum time instant to consider, in seconds. If None, start at the
    first sample.
tmax : float | None
    Maximum time instant to consider, in seconds. If None, end at the
    last sample.
ch_names : list of str | None
    A name for each time series. If None (the default), the series will
    be named 'SERIES###'.
n_fft : int | None
    Length of the FFT. If None, the exact number of samples between tmin
    and tmax will be used.
bandwidth : float | None
    The bandwidth of the multitaper windowing function, in Hz.
adaptive : bool
    Whether to use adaptive weights to combine the tapered spectra into
    the PSD.
low_bias : bool
    Only use tapers with more than 90% spectral concentration within the
    bandwidth.
projs : list of Projection | None
    List of projectors to store in the CSD object. Defaults to None,
    which means no projectors are stored.
n_jobs : int | None
    The number of jobs to run in parallel. If -1, it is set to the number
    of CPU cores. Requires the joblib package. None (the default) is a
    marker for 'unset' that will be interpreted as n_jobs=1 (sequential
    execution) unless the call is performed under a
    joblib.parallel_backend() context manager that sets another value for
    n_jobs.

Returns
-------
csd : instance of CrossSpectralDensity
    The computed cross-spectral density.
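The core of the multitaper estimate described above can be sketched with NumPy and SciPy: each epoch is windowed by a family of DPSS tapers, the cross-spectra of the tapered signals are averaged over tapers, and the results are averaged over epochs. The function name `multitaper_csd`, the taper-count heuristic, and the normalization below are illustrative assumptions, not the library's actual implementation (which additionally supports adaptive weighting, low-bias taper selection, and frequency/time restriction).

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_csd(X, sfreq, bandwidth):
    """Sketch of a multitaper cross-spectral density estimate.

    X : array, shape (n_epochs, n_channels, n_times)
    sfreq : sampling frequency in Hz
    bandwidth : bandwidth of the windowing function in Hz
    """
    n_epochs, n_channels, n_times = X.shape
    # Half time-bandwidth product NW; a common heuristic keeps
    # 2*NW - 1 tapers, which have good spectral concentration.
    half_nbw = bandwidth * n_times / (2.0 * sfreq)
    n_tapers = max(int(2 * half_nbw) - 1, 1)
    tapers = dpss(n_times, half_nbw, n_tapers)  # (n_tapers, n_times)

    freqs = np.fft.rfftfreq(n_times, 1.0 / sfreq)
    csd = np.zeros((n_channels, n_channels, freqs.size), dtype=complex)
    for epoch in X:
        # Taper every channel with every taper, then FFT:
        # result has shape (n_tapers, n_channels, n_freqs).
        spectra = np.fft.rfft(tapers[:, np.newaxis, :] * epoch[np.newaxis, :, :])
        # Cross-spectra S_c(f) * conj(S_d(f)), averaged over tapers.
        csd += np.einsum('tcf,tdf->cdf', spectra, spectra.conj()) / n_tapers
    return freqs, csd / n_epochs
```

The returned array is Hermitian in its channel axes at every frequency (`csd[c, d, f] == conj(csd[d, c, f])`), with real, non-negative auto-spectra on the diagonal.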