Estimate cross-spectral density from an array using a short-time Fourier transform.
Parameters
----------
X : array-like, shape (n_epochs, n_channels, n_times)
    The time series data, consisting of n_epochs separate observations
    of signals with n_channels time series of length n_times.
sfreq : float
    Sampling frequency of observations.
t0 : float
    Time of the first sample relative to the onset of the epoch, in
    seconds. Defaults to 0.
fmin : float
    Minimum frequency of interest, in Hertz.
fmax : float | numpy.inf
    Maximum frequency of interest, in Hertz.
tmin : float | None
    Minimum time instant to consider, in seconds. If None, start at the
    first sample.
tmax : float | None
    Maximum time instant to consider, in seconds. If None, end at the
    last sample.
ch_names : list of str | None
    A name for each time series. If None (the default), the series will
    be named 'SERIES###'.
n_fft : int | None
    Length of the FFT. If None, the exact number of samples between
    tmin and tmax will be used.
projs : list of Projection | None
    List of projectors to store in the CSD object. Defaults to None,
    which means no projectors are stored.
n_jobs : int | None
    The number of jobs to run in parallel. If -1, it is set to the
    number of CPU cores. Requires the joblib package. None (default)
    is a marker for 'unset' that will be interpreted as n_jobs=1
    (sequential execution) unless the call is performed under a
    joblib.parallel_backend() context manager that sets another value
    for n_jobs.
verbose : str | int | None
    Control verbosity of the logging output. If None, use the default
    verbosity level. See the logging documentation and mne.verbose()
    for details. Should only be passed as a keyword argument.

Returns
-------
csd : CrossSpectralDensity
    The computed cross-spectral density.
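To illustrate what a Fourier-based cross-spectral density estimate computes, here is a minimal NumPy-only sketch. It is an assumption-laden simplification, not the library's implementation: the function name `csd_fourier_sketch` is hypothetical, and it omits windowing, `t0`/`tmin`/`tmax` handling, projectors, and parallelism. It follows the same data layout as the parameter `X` above: each epoch/channel is FFT'd, and the cross-spectrum between channels i and j at each frequency is the average over epochs of `X_i(f) * conj(X_j(f))`.

```python
import numpy as np

def csd_fourier_sketch(X, sfreq, fmin=0.0, fmax=np.inf):
    """Sketch of a cross-spectral density estimate (hypothetical helper).

    X : ndarray, shape (n_epochs, n_channels, n_times)
    Returns (csd, freqs); csd has shape (n_channels, n_channels, n_freqs).
    """
    n_epochs, n_channels, n_times = X.shape
    # FFT along time; rfft keeps only non-negative frequencies.
    spec = np.fft.rfft(X, axis=-1)                # (n_epochs, n_channels, n_freqs)
    freqs = np.fft.rfftfreq(n_times, d=1.0 / sfreq)
    # Restrict to the frequency band of interest.
    mask = (freqs >= fmin) & (freqs <= fmax)
    spec, freqs = spec[..., mask], freqs[mask]
    # S_ij(f) = mean over epochs of X_i(f) * conj(X_j(f)).
    csd = np.einsum('ecf,edf->cdf', spec, spec.conj()) / n_epochs
    return csd, freqs

# Usage example: two noisy 10 Hz signals (sine and cosine) over 5 epochs.
rng = np.random.default_rng(0)
sfreq, n_times = 100.0, 200
t = np.arange(n_times) / sfreq
X = np.stack([
    np.stack([np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(n_times),
              np.cos(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(n_times)])
    for _ in range(5)
])
csd, freqs = csd_fourier_sketch(X, sfreq, fmin=5, fmax=15)
# The auto-spectrum of channel 0 peaks at the 10 Hz bin, and the CSD
# matrix is Hermitian across the two channel axes at every frequency.
```

Note the design choice mirrored from the documented function: averaging the per-epoch cross-spectra treats each epoch as an independent observation, which is what makes the estimate consistent as n_epochs grows.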