mne.minimum_norm.estimate_snr

mne.minimum_norm.estimate_snr(evoked, inv, verbose=None)

Estimate the SNR as a function of time for evoked data.

Parameters:
evoked : instance of Evoked

Evoked instance.

inv : instance of InverseOperator

The inverse operator.

verbose : bool | str | int | None

Control verbosity of the logging output. If None, use the default verbosity level. See the logging documentation and mne.verbose() for details. Should only be passed as a keyword argument.

Returns:
snr : ndarray, shape (n_times,)

The SNR estimated from the whitened data (i.e., GFP of whitened data).

snr_est : ndarray, shape (n_times,)

The SNR estimated using the mismatch between the unregularized solution and the regularized solution.
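As a minimal illustration of the first return value, the "GFP of whitened data" can be sketched with NumPy on synthetic values. This is a hedged sketch, not MNE's exact implementation: in practice the data are whitened using the noise covariance stored in the inverse operator, and here whitened data are simply simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_times = 60, 100

# Hypothetical whitened evoked data (channels x times); in MNE the
# whitening comes from the noise covariance in the inverse operator.
w = rng.standard_normal((n_channels, n_times))

# One reading of "GFP of whitened data": the RMS across channels at
# each time point, giving one SNR estimate per sample.
snr = np.sqrt(np.mean(w ** 2, axis=0))
```

The result has shape `(n_times,)`, matching the documented return value.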

Notes

snr_est is estimated by using different amounts of inverse regularization and checking the mismatch between predicted and measured whitened data.

In more detail, given our whitened inverse obtained from SVD:

\tilde{M} = R^{1/2} V \Gamma U^\top

The values in the diagonal matrix Γ are expressed in terms of the chosen regularization λ² ≈ 1/SNR² and the singular values λk as:

\gamma_k = \frac{1}{\lambda_k} \frac{\lambda_k^2}{\lambda_k^2 + \lambda^2}
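The formula for γk can be checked numerically. The singular values and SNR below are hypothetical, chosen only to illustrate the expression:

```python
import numpy as np

# Hypothetical singular values of the whitened gain matrix
lambda_k = np.array([5.0, 2.0, 0.5])

# Regularization lambda^2 = 1 / SNR^2, here for an assumed SNR of 3
lam2 = 1.0 / 3.0 ** 2

# gamma_k = (1 / lambda_k) * lambda_k^2 / (lambda_k^2 + lambda^2)
gamma = (1.0 / lambda_k) * lambda_k ** 2 / (lambda_k ** 2 + lam2)

# With no regularization (lambda^2 = 0), gamma_k reduces to 1 / lambda_k
gamma_noreg = (1.0 / lambda_k) * lambda_k ** 2 / (lambda_k ** 2 + 0.0)
```

Note that regularization shrinks each γk below the unregularized value 1/λk, with the largest relative shrinkage on the smallest singular values.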

We also know that our predicted data is given by:

\hat{x}(t) = G \hat{j}(t) = C^{1/2} U \Pi w(t)

And thus our predicted whitened data is just:

\hat{w}(t) = U \Pi w(t)

where Π is diagonal with entries:

\lambda_k \gamma_k = \frac{\lambda_k^2}{\lambda_k^2 + \lambda^2}

If we use no regularization, note that Π is just the identity matrix. Here we test the squared magnitude of the difference between the unregularized and regularized solutions, choosing the largest regularization that achieves a χ²-test significance of 0.001.
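The selection step can be sketched as follows. This is a simplified illustration on synthetic values, not MNE's exact procedure: the grid over λ², the single-time-point mismatch, and the use of `scipy.stats.chi2` for the threshold are all assumptions made for the sketch. Because U is orthogonal, the difference between the unregularized (Π = I) and regularized predicted whitened data can be measured directly in the rotated basis Uᵀw.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n_channels = 60

# Hypothetical whitened data at one time point, already rotated by U^T
# (U is orthogonal, so squared magnitudes are preserved in this basis).
utw = rng.standard_normal(n_channels)

# Hypothetical singular values of the whitened gain matrix
lambda_k = np.abs(rng.standard_normal(n_channels)) + 0.1

def mismatch(lam2):
    # Squared magnitude of the difference between the unregularized
    # (Pi = I) and regularized predicted whitened data.
    pi = lambda_k ** 2 / (lambda_k ** 2 + lam2)
    return np.sum(((1.0 - pi) * utw) ** 2)

# Chi^2 threshold at significance 0.001 with n_channels degrees of freedom
thresh = chi2.ppf(1.0 - 0.001, df=n_channels)

# Largest regularization on the grid whose mismatch stays below threshold
lam2_grid = np.logspace(-4, 2, 200)
ok = [l2 for l2 in lam2_grid if mismatch(l2) <= thresh]
lam2_best = max(ok)
```

The mismatch grows monotonically with λ², so the search amounts to finding where it crosses the χ² threshold.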

New in version 0.9.0.
