Estimate cross-spectral density from an array using Morlet wavelets.
Parameters
----------
X : ndarray, shape (n_epochs, n_channels, n_times)
    The time series data consisting of ``n_epochs`` separate observations
    of signals with ``n_channels`` time series of length ``n_times``.
sfreq : float
    Sampling frequency of observations.
frequencies : list of float
    The frequencies of interest, in Hertz.
t0 : float
    Time of the first sample relative to the onset of the epoch, in
    seconds. Defaults to 0.
tmin : float | None
    Minimum time instant to consider, in seconds. If ``None``, start at the
    first sample.
tmax : float | None
    Maximum time instant to consider, in seconds. If ``None``, end at the
    last sample.
ch_names : list of str | None
    A name for each time series. If ``None`` (the default), the series will
    be named 'SERIES###'.
n_cycles : float | list of float | None
    Number of cycles to use when constructing Morlet wavelets. Fixed number
    or one per frequency. Defaults to 7.
use_fft : bool
    Whether to use FFT-based convolution to compute the wavelet transform.
    Defaults to ``True``.
decim : int | slice
    Decimation factor to apply during time-frequency decomposition, to
    reduce memory usage. Defaults to 1 (no decimation).
projs : list of Projection | None
    List of projectors to store in the CSD object. Defaults to ``None``,
    which means the projectors defined in the Epochs object will be copied.
n_jobs : int | None
    The number of jobs to run in parallel. If ``-1``, it is set to the
    number of CPU cores. Requires the ``joblib`` package. ``None`` (the
    default) is a marker for 'unset' that will be interpreted as
    ``n_jobs=1`` (sequential execution) unless the call is performed under
    a ``joblib.parallel_backend()`` context manager that sets another value
    for ``n_jobs``.
verbose : str | int | None
    Control verbosity of the logging output. If ``None``, use the default
    verbosity level. See the logging documentation and ``mne.verbose()``
    for details. Should only be passed as a keyword argument.
Returns
-------
csd : CrossSpectralDensity
    The computed cross-spectral density.
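To make the estimation concrete, here is a minimal NumPy-only sketch of the idea behind Morlet-wavelet CSD estimation: each channel is convolved (via the FFT) with a complex Morlet wavelet at every frequency of interest, and cross-spectra are formed from outer products of the wavelet coefficients averaged over time and epochs. This is an illustrative assumption-laden reimplementation, not the library's actual code; the ``morlet_csd`` helper and its normalisation choices are hypothetical.

```python
import numpy as np

def morlet_csd(X, sfreq, frequencies, n_cycles=7):
    """Sketch of Morlet-wavelet CSD; returns (n_freqs, n_channels, n_channels)."""
    n_epochs, n_channels, n_times = X.shape
    csd = np.zeros((len(frequencies), n_channels, n_channels), dtype=complex)
    t = np.arange(n_times) / sfreq
    for fi, f in enumerate(frequencies):
        # Gaussian envelope width in seconds, set by the number of cycles
        sigma_t = n_cycles / (2.0 * np.pi * f)
        tc = t - t[n_times // 2]  # wavelet centred in the window
        wavelet = np.exp(2j * np.pi * f * tc) * np.exp(-tc**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit energy
        # FFT-based (circular) convolution of every channel with the wavelet
        W = np.fft.ifft(
            np.fft.fft(X, axis=-1) * np.fft.fft(np.fft.ifftshift(wavelet)),
            axis=-1,
        )
        # Cross-spectra: outer products averaged over epochs and time points
        csd[fi] = np.einsum('ect,edt->cd', W, W.conj()) / (n_epochs * n_times)
    return csd
```

In actual use one would instead call the library function with an array of shape ``(n_epochs, n_channels, n_times)``, the sampling frequency, and the frequencies of interest; the sketch above only shows why the result is a Hermitian matrix per frequency, with channel power on the diagonal and cross-power off it.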