mne.decoding.TimeFrequency

class mne.decoding.TimeFrequency(freqs, sfreq=1.0, method='morlet', n_cycles=7.0, time_bandwidth=None, use_fft=True, decim=1, output='complex', n_jobs=1, verbose=None)[source]

Time-frequency transformer.

Time-frequency transform of time series along the last axis.

Parameters
freqs : array_like of float, shape (n_freqs,)

The frequencies.

sfreq : float | int, default 1.0

Sampling frequency of the data.

method : ‘multitaper’ | ‘morlet’, default ‘morlet’

The time-frequency method. ‘morlet’ convolves the data with Morlet wavelets. ‘multitaper’ uses Morlet wavelets windowed with multiple DPSS tapers.

n_cycles : float | array of float, default 7.0

Number of cycles in the Morlet wavelet. Fixed number or one per frequency.

time_bandwidth : float, default None

Time × (full) bandwidth product. Only used if method == ‘multitaper’. If None, set to 4.0 (3 tapers). The number of good (low-bias) tapers is chosen automatically to equal floor(time_bandwidth - 1).

use_fft : bool, default True

Whether to use FFT-based convolution.

decim : int | slice, default 1

Decimation factor applied after the time-frequency decomposition, to reduce memory usage. If int, returns tfr[…, ::decim]; if slice, returns tfr[…, decim].

Note

Decimation may create aliasing artifacts, since it is applied after the convolutions.
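The two decim forms can be checked with plain NumPy indexing (a minimal sketch; the array here just stands in for a transformer output of shape (n_epochs, n_channels, n_freqs, n_times)):

```python
import numpy as np

# Stand-in for a time-frequency output: (n_epochs, n_channels, n_freqs, n_times)
tfr = np.zeros((2, 3, 4, 100))

# decim=4 (int form) keeps every 4th time sample: tfr[..., ::4]
print(tfr[..., ::4].shape)               # (2, 3, 4, 25)

# decim=slice(10, 90, 4) keeps tfr[..., 10:90:4]
print(tfr[..., slice(10, 90, 4)].shape)  # (2, 3, 4, 20)
```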

output : str, default ‘complex’
  • ‘complex’ : single trial complex.

  • ‘power’ : single trial power.

  • ‘phase’ : single trial phase.

n_jobs : int, default 1

The number of jobs to run in parallel (requires the joblib package). The parallelization is implemented across channels.

verbose : bool, str, int, or None

If not None, override default verbose level (see mne.verbose() and Logging documentation for more).

Methods

__hash__(self, /)

Return hash(self).

fit(self, X[, y])

Do nothing (for scikit-learn compatibility purposes).

fit_transform(self, X[, y])

Time-frequency transform of time series along the last axis.

get_params(self[, deep])

Get parameters for this estimator.

set_params(self, **params)

Set the parameters of this estimator.

transform(self, X)

Time-frequency transform of time series along the last axis.


fit(self, X, y=None)[source]

Do nothing (for scikit-learn compatibility purposes).

Parameters
X : array, shape (n_samples, n_channels, n_times)

The training data.

y : array | None

The target values.

Returns

self : object

Return self.

fit_transform(self, X, y=None)[source]

Time-frequency transform of time series along the last axis.

Parameters
X : array, shape (n_samples, n_channels, n_times)

The training data samples. The channel dimension can be zero- or 1-dimensional.

y : None

For scikit-learn compatibility purposes.

Returns

Xt : array, shape (n_samples, n_channels, n_freqs, n_times)

The time-frequency transform of the data, where n_channels can be zero- or 1-dimensional.

get_params(self, deep=True)[source]

Get parameters for this estimator.

Parameters
deep : bool, default True

If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns

params : mapping of str to any

Parameter names mapped to their values.

set_params(self, **params)[source]

Set the parameters of this estimator. The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Returns

self : object

Return self.

transform(self, X)[source]

Time-frequency transform of time series along the last axis.

Parameters
X : array, shape (n_samples, n_channels, n_times)

The data samples. The channel dimension can be zero- or 1-dimensional.

Returns

Xt : array, shape (n_samples, n_channels, n_freqs, n_times)

The time-frequency transform of the data, where n_channels can be zero- or 1-dimensional.