- mne.time_frequency.csd_array_morlet(X, sfreq, frequencies, t0=0, tmin=None, tmax=None, ch_names=None, n_cycles=7, use_fft=True, decim=1, projs=None, n_jobs=None, *, verbose=None)
Estimate cross-spectral density from an array using Morlet wavelets.
Parameters:
- X : array_like, shape (n_epochs, n_channels, n_times)
  The time series data, consisting of n_epochs separate observations of signals with n_channels time series of length n_times.
- sfreq : float
  Sampling frequency of the observations, in Hertz.
- frequencies : list of float
  The frequencies of interest, in Hertz.
- t0 : float
  Time of the first sample relative to the onset of the epoch, in seconds. Defaults to 0.
- tmin : float | None
  Minimum time instant to consider, in seconds. If None, start at the first sample. Defaults to None.
- tmax : float | None
  Maximum time instant to consider, in seconds. If None, end at the last sample. Defaults to None.
- ch_names : list of str | None
  A name for each time series. If None (the default), the series will be named ‘SERIES###’.
- n_cycles : float | list of float
  Number of cycles to use when constructing the Morlet wavelets: either a fixed number, or one per frequency. Defaults to 7.
- use_fft : bool
  Whether to use FFT-based convolution to compute the wavelet transform. Defaults to True.
- decim : int
  Decimation factor applied during the time-frequency decomposition to reduce memory usage. Defaults to 1 (no decimation).
- projs : list of Projection | None
  List of projectors to store in the CSD object. Defaults to None, which means the projectors defined in the Epochs object will be copied.
- n_jobs : int | None
  The number of jobs to run in parallel. If -1, it is set to the number of CPU cores. None (the default) is a marker for ‘unset’ that is interpreted as n_jobs=1 (sequential execution) unless the call is performed under a joblib.parallel_config context manager that sets another value for n_jobs. Requires the joblib package.
- verbose : bool | str | int | None
  Control verbosity of the logging output. If None, use the default verbosity level.
Returns:
- csd : instance of CrossSpectralDensity
  The computed cross-spectral density.