mne.decoding.TimeFrequency

class mne.decoding.TimeFrequency(freqs, sfreq=1.0, method='morlet', n_cycles=7.0, time_bandwidth=None, use_fft=True, decim=1, output='complex', n_jobs=1, verbose=None)[source]

Time-frequency transformer.

Time-frequency transform of time series along the last axis.

Parameters
freqs : array_like of float, shape (n_freqs,)

The frequencies.

sfreq : float | int, default 1.0

Sampling frequency of the data.

method : ‘multitaper’ | ‘morlet’, default ‘morlet’

The time-frequency method. ‘morlet’ convolves the data with a Morlet wavelet. ‘multitaper’ uses Morlet wavelets windowed with multiple DPSS tapers.

n_cycles : float | array of float, default 7.0

Number of cycles in the Morlet wavelet, either a fixed number or one per frequency.

time_bandwidth : float, default None

Time × (full) bandwidth product. Only used if method == ‘multitaper’. If None and method is ‘multitaper’, this is set to 4.0 (3 tapers). The number of good (low-bias) tapers is chosen automatically as floor(time_bandwidth - 1).

use_fft : bool, default True

Whether to use the FFT for convolutions.

decim : int | slice, default 1

Decimation factor applied after the time-frequency decomposition to reduce memory usage. If int, returns tfr[…, ::decim]; if slice, returns tfr[…, decim].

Note

Decimation is applied after the convolutions, but it may still introduce aliasing artifacts; its effect on the output shape is shown in the sketch after the parameter list.

output : str, default ‘complex’
  • ‘complex’ : single trial complex.

  • ‘power’ : single trial power.

  • ‘phase’ : single trial phase.

  (A sketch relating these three output modes follows the parameter list.)

n_jobs : int

The number of jobs to run in parallel (default 1). If -1, it is set to the number of CPU cores. Requires the joblib package. The parallelization is implemented across channels.

verbose : bool | str | int | None

Control verbosity of the logging output. If None, use the default verbosity level. See the logging documentation and mne.verbose() for details. Should only be passed as a keyword argument.
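The parameters above translate into the following minimal usage sketch on simulated data. The array sizes, sampling frequency, and frequency grid are illustrative, and the stated relation between the ‘power’/‘phase’ and ‘complex’ outputs is an assumption based on common TFR conventions, not a quote from the docstring:

    import numpy as np
    from mne.decoding import TimeFrequency

    rng = np.random.default_rng(42)
    n_epochs, n_channels, n_times = 10, 3, 500
    sfreq = 100.0                                # assumed sampling rate (Hz)
    X = rng.standard_normal((n_epochs, n_channels, n_times))
    freqs = np.arange(8.0, 30.0, 2.0)            # 11 frequencies of interest (Hz)

    # Basic transform: the output gains a frequency axis.
    tf = TimeFrequency(freqs, sfreq=sfreq, method='morlet', n_cycles=4.0,
                       output='power')
    Xt = tf.fit_transform(X)
    print(Xt.shape)                              # (10, 3, 11, 500)

    # decim keeps every decim-th time sample after the convolutions.
    tf_dec = TimeFrequency(freqs, sfreq=sfreq, n_cycles=4.0, output='power',
                           decim=4)
    print(tf_dec.fit_transform(X).shape)         # (10, 3, 11, 125)

    # Assumed relation between the output modes: power == |complex| ** 2
    # and phase == angle(complex).
    Xc = TimeFrequency(freqs, sfreq=sfreq, n_cycles=4.0,
                       output='complex').fit_transform(X)
    Xp = TimeFrequency(freqs, sfreq=sfreq, n_cycles=4.0,
                       output='power').fit_transform(X)
    Xa = TimeFrequency(freqs, sfreq=sfreq, n_cycles=4.0,
                       output='phase').fit_transform(X)
    print(np.allclose(Xp, np.abs(Xc) ** 2), np.allclose(Xa, np.angle(Xc)))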

Methods

__hash__(/)

Return hash(self).

fit(X[, y])

Do nothing (for scikit-learn compatibility purposes).

fit_transform(X[, y])

Time-frequency transform of time series along the last axis.

get_params([deep])

Get parameters for this estimator.

set_params(**params)

Set the parameters of this estimator.

transform(X)

Time-frequency transform of time series along the last axis.

fit(X, y=None)[source]

Do nothing (for scikit-learn compatibility purposes).

Parameters
X : array, shape (n_samples, n_channels, n_times)

The training data.

y : array | None

The target values.

Returns
self : object

Return self.
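Because fit simply returns the transformer, calls can be chained; a minimal sketch with made-up data (names and sizes are illustrative):

    import numpy as np
    from mne.decoding import TimeFrequency

    X = np.random.randn(6, 2, 300)        # (n_samples, n_channels, n_times)
    tf = TimeFrequency([10.0, 15.0, 20.0], sfreq=100.0)
    assert tf.fit(X) is tf                # fit is a no-op returning self
    Xt = tf.fit(X).transform(X)           # so chaining works as usual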

fit_transform(X, y=None)[source]

Time-frequency transform of time series along the last axis.

Parameters
X : array, shape (n_samples, n_channels, n_times)

The training data samples. The channel dimension can be zero- or one-dimensional, i.e. X may also be passed as a 2D array of shape (n_samples, n_times).

y : None

For scikit-learn compatibility purposes.

Returns
Xt : array, shape (n_samples, n_channels, n_freqs, n_times)

The time-frequency transform of the data; the channel dimension is present in the output only if it was present in the input.
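A hedged sketch of the zero-dimensional-channel case described above (the 2D-in / 3D-out behavior is my reading of the shape description, not verified against every MNE version):

    import numpy as np
    from mne.decoding import TimeFrequency

    X = np.random.randn(8, 300)                  # no channel axis: (n_samples, n_times)
    freqs = np.arange(10.0, 20.0, 2.0)           # 5 frequencies
    Xt = TimeFrequency(freqs, sfreq=100.0, output='power').fit_transform(X)
    print(Xt.shape)                              # expected: (8, 5, 300)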

get_params(deep=True)[source]

Get parameters for this estimator.

Parameters
deep : bool, optional

If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns
params : dict

Parameter names mapped to their values.
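A minimal sketch, assuming the standard scikit-learn behavior of returning the constructor arguments keyed by name:

    from mne.decoding import TimeFrequency

    tf = TimeFrequency([10.0, 20.0], sfreq=250.0, decim=2)
    params = tf.get_params()
    print(params['sfreq'], params['decim'])      # expected: 250.0 2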

set_params(**params)[source]

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter> so that it’s possible to update each component of a nested object.

Parameters
**params : dict

Estimator parameters.

Returns
inst : instance

The object.
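A hedged sketch of the nested <component>__<parameter> form, using a scikit-learn Pipeline whose TimeFrequency step is (arbitrarily) named 'tfr':

    from sklearn.pipeline import Pipeline
    from mne.decoding import TimeFrequency

    pipe = Pipeline([('tfr', TimeFrequency([10.0, 20.0], sfreq=100.0))])
    pipe.set_params(tfr__decim=2, tfr__output='power')
    print(pipe.named_steps['tfr'].decim)         # expected: 2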

transform(X)[source]

Time-frequency transform of time series along the last axis.

Parameters
X : array, shape (n_samples, n_channels, n_times)

The data samples. The channel dimension can be zero- or one-dimensional, i.e. X may also be passed as a 2D array of shape (n_samples, n_times).

Returns
Xt : array, shape (n_samples, n_channels, n_freqs, n_times)

The time-frequency transform of the data; the channel dimension is present in the output only if it was present in the input.
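Putting it together, a minimal end-to-end sketch in which transform is applied to new data after the (no-op) fit; all names and sizes are illustrative:

    import numpy as np
    from mne.decoding import TimeFrequency

    rng = np.random.default_rng(0)
    X_train = rng.standard_normal((12, 4, 400))
    X_new = rng.standard_normal((5, 4, 400))
    freqs = np.arange(8.0, 24.0, 4.0)            # [8, 12, 16, 20] Hz

    tf = TimeFrequency(freqs, sfreq=128.0, n_cycles=5.0, output='power')
    tf.fit(X_train)                              # no-op, kept for pipeline symmetry
    Xt = tf.transform(X_new)
    print(Xt.shape)                              # expected: (5, 4, 4, 400)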