Spatiotemporal permutation F-test on full sensor data#

Tests for differential evoked responses in at least one condition using a permutation clustering test. The FieldTrip neighbor templates will be used to determine the adjacency between sensors, and this adjacency serves as a spatial prior for the clustering. Spatiotemporal clusters will then be visualized using custom matplotlib code.

Here, the unit of observation is epochs from a specific study subject. However, the same logic applies when the unit of observation is a number of study subjects, each of whom contributes their own averaged data (i.e., an average of their epochs). This would then be considered an analysis at the “2nd level”.
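To make the “2nd-level” idea concrete, here is a minimal sketch (all names, shapes, and values below are made up purely for illustration and are not part of this analysis): each subject contributes one averaged array per condition, and the cluster test is then called exactly as in the single-subject case shown later in this tutorial.

# Hypothetical 2nd-level setup: one averaged (evoked) data array per subject
# and condition; shapes and values are invented for illustration only.
import numpy as np

n_subjects, n_times, n_channels = 20, 53, 102
rng = np.random.default_rng(0)

# one entry per condition, each of shape (n_subjects, n_times, n_channels)
X_group = [rng.normal(size=(n_subjects, n_times, n_channels)) for _ in range(2)]

# the test itself would then be called exactly as later in this tutorial, e.g.
# spatio_temporal_cluster_test(X_group, threshold=..., adjacency=...)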

See the FieldTrip tutorial for a caveat regarding the possible interpretation of “significant” clusters.

For more information on cluster-based permutation testing in MNE-Python, see also: Non-parametric 1 sample cluster statistic on single trial power.

# Authors: Denis Engemann <denis.engemann@gmail.com>
#          Jona Sassenhagen <jona.sassenhagen@gmail.com>
#          Alex Rockhill <aprockhill@mailbox.org>
#          Stefan Appelhoff <stefan.appelhoff@mailbox.org>
#
# License: BSD-3-Clause
# Copyright the MNE-Python contributors.
import matplotlib.pyplot as plt
import numpy as np
import scipy.stats
from mpl_toolkits.axes_grid1 import make_axes_locatable

import mne
from mne.channels import find_ch_adjacency
from mne.datasets import sample
from mne.stats import combine_adjacency, spatio_temporal_cluster_test
from mne.viz import plot_compare_evokeds

Set parameters#

data_path = sample.data_path()
meg_path = data_path / "MEG" / "sample"
raw_fname = meg_path / "sample_audvis_filt-0-40_raw.fif"
event_fname = meg_path / "sample_audvis_filt-0-40_raw-eve.fif"
event_id = {"Aud/L": 1, "Aud/R": 2, "Vis/L": 3, "Vis/R": 4}
tmin = -0.2
tmax = 0.5

# Setup for reading the raw data
raw = mne.io.read_raw_fif(raw_fname, preload=True)
raw.filter(1, 25)
events = mne.read_events(event_fname)
Opening raw data file /home/circleci/mne_data/MNE-sample-data/MEG/sample/sample_audvis_filt-0-40_raw.fif...
    Read a total of 4 projection items:
        PCA-v1 (1 x 102)  idle
        PCA-v2 (1 x 102)  idle
        PCA-v3 (1 x 102)  idle
        Average EEG reference (1 x 60)  idle
    Range : 6450 ... 48149 =     42.956 ...   320.665 secs
Ready.
Reading 0 ... 41699  =      0.000 ...   277.709 secs...
Filtering raw data in 1 contiguous segment
Setting up band-pass filter from 1 - 25 Hz

FIR filter parameters
---------------------
Designing a one-pass, zero-phase, non-causal bandpass filter:
- Windowed time-domain design (firwin) method
- Hamming window with 0.0194 passband ripple and 53 dB stopband attenuation
- Lower passband edge: 1.00
- Lower transition bandwidth: 1.00 Hz (-6 dB cutoff frequency: 0.50 Hz)
- Upper passband edge: 25.00 Hz
- Upper transition bandwidth: 6.25 Hz (-6 dB cutoff frequency: 28.12 Hz)
- Filter length: 497 samples (3.310 s)


Read epochs for the channels of interest#

picks = mne.pick_types(raw.info, meg="mag", eog=True)

reject = dict(mag=4e-12, eog=150e-6)
epochs = mne.Epochs(
    raw,
    events,
    event_id,
    tmin,
    tmax,
    picks=picks,
    decim=2,  # just for speed!
    baseline=None,
    reject=reject,
    preload=True,
)

epochs.drop_channels(["EOG 061"])
epochs.equalize_event_counts(event_id)

# Obtain the data as a 3D matrix and transpose it such that
# the dimensions are as expected for the cluster permutation test:
# n_epochs × n_times × n_channels
X = [epochs[event_name].get_data(copy=False) for event_name in event_id]
X = [np.transpose(x, (0, 2, 1)) for x in X]
Not setting metadata
288 matching events found
No baseline correction applied
Created an SSP operator (subspace dimension = 3)
3 projection items activated
Using data from preloaded Raw for 288 events and 106 original time points (prior to decimation) ...
    Rejecting  epoch based on EOG : ['EOG 061']
    [... the same EOG rejection message is repeated for 46 epochs in total ...]
    Rejecting  epoch based on MAG : ['MEG 1711']
    [... the same MAG rejection message is repeated for 2 epochs in total ...]
48 bad epochs dropped
Dropped 16 epochs: 50, 51, 84, 93, 95, 96, 129, 146, 149, 154, 158, 195, 201, 203, 211, 212
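
As a quick sanity check (this snippet is an optional addition), each entry of X should now have the shape n_epochs × n_times × n_channels expected by the cluster test, with equal epoch counts across conditions:

# optional check of the shapes going into the cluster test
for condition, x in zip(event_id, X):
    print(condition, x.shape)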

Find the FieldTrip neighbor definition to set up sensor adjacency#
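
Based on the imports above and the output below, the sensor adjacency is presumably computed along these lines (ch_type="mag" is assumed, since only magnetometers remain in the epochs):

# presumed adjacency computation; produces the `adjacency` matrix used by the
# cluster test below
adjacency, ch_names = find_ch_adjacency(epochs.info, ch_type="mag")

print(type(adjacency))  # it's a sparse scipy array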

Reading adjacency matrix for neuromag306mag.
<class 'scipy.sparse._csr.csr_array'>

Compute permutation statistic#

How does it work? We use clustering to “bind” together features that are similar. Our features are the magnetic fields measured over our sensor array at different time points, and clustering reduces the multiple-comparisons problem. To compute the actual test statistic, we sum the F-values within each cluster, which yields one statistic (the cluster mass) per cluster. We then generate a null distribution by shuffling the condition labels across our samples and recomputing the clusters and their test statistics. Finally, we assess the significance of a given cluster by computing the probability of observing a cluster of at least that size under this null distribution [1][2].
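
To make this recipe concrete, here is a schematic sketch of the cluster-mass permutation logic on a toy 1-D statistic map. It is only an illustration of the idea; MNE's implementation works on the full spatiotemporal data and builds the null distribution by shuffling condition labels rather than by permuting the statistic map:

# Schematic illustration only (NOT MNE's implementation) of cluster-mass
# permutation testing on a toy 1-D statistic map.
import numpy as np

rng = np.random.default_rng(0)


def cluster_masses(stat_map, threshold):
    """Sum the statistic over each contiguous run of supra-threshold points."""
    masses, current = [], 0.0
    for value in stat_map:
        if value > threshold:
            current += value
        elif current:
            masses.append(current)
            current = 0.0
    if current:
        masses.append(current)
    return masses


# toy "observed" 1-D statistic map with an effect in the middle
observed = rng.f(dfnum=3, dfden=100, size=50)
observed[20:30] += 5.0
observed_masses = cluster_masses(observed, threshold=5.0)

# null distribution of the *maximum* cluster mass; permuting the map here
# stands in for re-labelling the conditions in the real analysis
null_max = [
    max(cluster_masses(rng.permutation(observed), 5.0), default=0.0)
    for _ in range(1000)
]
p_values = [np.mean(np.array(null_max) >= mass) for mass in observed_masses]
print(observed_masses, p_values)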

# We are running an F test, so we look at the upper tail
# see also: https://stats.stackexchange.com/a/73993
tail = 1

# We want to set a critical test statistic (here: F), to determine when
# clusters are being formed. Using Scipy's percent point function of the F
# distribution, we can conveniently select a threshold that corresponds to
# some alpha level that we arbitrarily pick.
alpha_cluster_forming = 0.001

# For an F test we need the degrees of freedom for the numerator
# (number of conditions - 1) and the denominator (number of observations
# - number of conditions):
n_conditions = len(event_id)
n_observations = len(X[0])
dfn = n_conditions - 1
dfd = n_observations - n_conditions

# Note: we calculate 1 - alpha_cluster_forming to get the critical value
# on the right tail
f_thresh = scipy.stats.f.ppf(1 - alpha_cluster_forming, dfn=dfn, dfd=dfd)

# run the cluster based permutation analysis
cluster_stats = spatio_temporal_cluster_test(
    X,
    n_permutations=1000,
    threshold=f_thresh,
    tail=tail,
    n_jobs=None,
    buffer_size=None,
    adjacency=adjacency,
)
F_obs, clusters, p_values, _ = cluster_stats
stat_fun(H1): min=0.0033787374718191373 max=207.77535362731717
Running initial clustering …
Found 22 clusters

100%|██████████| Permuting : 999/999 [00:03<00:00,  295.16it/s]

Note

Note how we only specified an adjacency for sensors! However, because we used mne.stats.spatio_temporal_cluster_test(), an adjacency for time points was automatically taken into account: at time point N, the time points N - 1 and N + 1 are considered adjacent (this is also called “lattice adjacency”). This is only possible because we ran the analysis on 2D data (times × channels) per observation; for 3D data per observation (e.g., times × frequencies × channels), we will need to use mne.stats.combine_adjacency(), as shown further below.

Note also that the same functions work with source estimates. The only differences are the origin of the data, its size, and the adjacency definition. They can be used for single trials or for groups of subjects.
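
As a tiny illustration of what such a lattice looks like (this snippet is a standalone demonstration, not part of the analysis), mne.stats.combine_adjacency() can also build lattice adjacencies directly from dimension sizes:

# lattice adjacency for, e.g., 3 frequencies × 4 time points: each point is
# adjacent to its immediate neighbors along each dimension
lattice = combine_adjacency(3, 4)
print(lattice.shape)  # (12, 12): one row/column per (frequency, time) pair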

Visualize clusters#

# We subselect clusters that we consider significant at an arbitrarily
# picked alpha level: "p_accept".
# NOTE: remember the caveats with respect to "significant" clusters that
# we mentioned in the introduction of this tutorial!
p_accept = 0.01
good_cluster_inds = np.where(p_values < p_accept)[0]

# configure variables for visualization
colors = {"Aud": "crimson", "Vis": "steelblue"}
linestyles = {"L": "-", "R": "--"}

# organize data for plotting
evokeds = {cond: epochs[cond].average() for cond in event_id}

# loop over clusters
for i_clu, clu_idx in enumerate(good_cluster_inds):
    # unpack cluster information, get unique indices
    time_inds, space_inds = np.squeeze(clusters[clu_idx])
    ch_inds = np.unique(space_inds)
    time_inds = np.unique(time_inds)

    # get topography for F stat
    f_map = F_obs[time_inds, ...].mean(axis=0)

    # get signals at the sensors contributing to the cluster
    sig_times = epochs.times[time_inds]

    # create spatial mask
    mask = np.zeros((f_map.shape[0], 1), dtype=bool)
    mask[ch_inds, :] = True

    # initialize figure
    fig, ax_topo = plt.subplots(1, 1, figsize=(10, 3), layout="constrained")

    # plot average test statistic and mark significant sensors
    f_evoked = mne.EvokedArray(f_map[:, np.newaxis], epochs.info, tmin=0)
    f_evoked.plot_topomap(
        times=0,
        mask=mask,
        axes=ax_topo,
        cmap="Reds",
        vlim=(np.min, np.max),
        show=False,
        colorbar=False,
        mask_params=dict(markersize=10),
    )
    image = ax_topo.images[0]

    # remove the title that would otherwise say "0.000 s"
    ax_topo.set_title("")

    # create additional axes (for ERF and colorbar)
    divider = make_axes_locatable(ax_topo)

    # add axes for colorbar
    ax_colorbar = divider.append_axes("right", size="5%", pad=0.05)
    plt.colorbar(image, cax=ax_colorbar)
    ax_topo.set_xlabel(
        "Averaged F-map ({:0.3f} - {:0.3f} s)".format(*sig_times[[0, -1]])
    )

    # add new axis for time courses and plot time courses
    ax_signals = divider.append_axes("right", size="300%", pad=1.2)
    title = f"Cluster #{i_clu + 1}, {len(ch_inds)} sensor"
    if len(ch_inds) > 1:
        title += "s (mean)"
    plot_compare_evokeds(
        evokeds,
        title=title,
        picks=ch_inds,
        axes=ax_signals,
        colors=colors,
        linestyles=linestyles,
        show=False,
        split_legend=True,
        truncate_yaxis="auto",
    )

    # plot temporal cluster extent
    ymin, ymax = ax_signals.get_ylim()
    ax_signals.fill_betweenx(
        (ymin, ymax), sig_times[0], sig_times[-1], color="orange", alpha=0.3
    )

plt.show()
[Output figures, one per significant cluster: Cluster #1 (4 sensors), Cluster #2 (85 sensors), Cluster #3 (7 sensors), Cluster #4 (8 sensors), Cluster #5 (3 sensors), Cluster #6 (34 sensors), Cluster #7 (5 sensors), Cluster #8 (13 sensors); each figure shows the averaged F-map topography and the compared evoked responses combined across the cluster's sensors.]
combining channels using RMS (mag channels)
[... this message is repeated once per condition and cluster plotted ...]

Permutation statistic for time-frequencies#

Let’s do the same thing with the time-frequency decomposition of the data (see Frequency and time-frequency sensor analysis for a tutorial and Time-frequency on simulated data (Multitaper vs. Morlet vs. Stockwell vs. Hilbert) for a comparison of time-frequency methods) to show how cluster permutations can be done on higher-dimensional data.

decim = 4
freqs = np.arange(7, 30, 3)  # define frequencies of interest
n_cycles = freqs / freqs[0]

epochs_power = list()
for condition in [epochs[k] for k in ("Aud/L", "Vis/L")]:
    this_tfr = condition.compute_tfr(
        method="morlet",
        freqs=freqs,
        n_cycles=n_cycles,
        decim=decim,
        average=False,
        return_itc=False,
    )
    this_tfr.apply_baseline(mode="ratio", baseline=(None, 0))
    epochs_power.append(this_tfr.data)

# transpose again to (epochs, frequencies, times, channels)
X = [np.transpose(x, (0, 2, 3, 1)) for x in epochs_power]
Applying baseline correction (mode: ratio)
Applying baseline correction (mode: ratio)

Remember the note on the adjacency matrix from above: For 3D data, as here, we must use mne.stats.combine_adjacency() to extend the sensor-based adjacency to incorporate the time-frequency plane as well.

Here, the integer inputs are converted into lattice adjacencies and combined with the sensor adjacency matrix, so that data at nearby times, nearby frequencies, and nearby sensor locations are clustered together.

# our data at each observation is of shape frequencies × times × channels
tfr_adjacency = combine_adjacency(len(freqs), len(this_tfr.times), adjacency)

Now we can run the cluster permutation test, but first we have to set a threshold. Because this example decimates in time and uses only a few frequencies, we need to raise the threshold above the default value in order to obtain well-differentiated clusters (i.e., so that the algorithm doesn't simply find one large cluster). For a more principled way of setting this parameter, threshold-free cluster enhancement (TFCE) may be used. See Statistical inference for a discussion.
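
For reference, TFCE is requested in MNE by passing a dict with “start” and “step” keys as the threshold. The sketch below shows what such a call could look like; the parameter values are arbitrary, and it is left commented out because TFCE is considerably slower (every point effectively becomes its own cluster):

# Sketch only (arbitrary values): request TFCE instead of a fixed threshold.
# tfce_threshold = dict(start=0, step=0.5)
# cluster_stats_tfce = spatio_temporal_cluster_test(
#     X,
#     n_permutations=1000,
#     threshold=tfce_threshold,
#     tail=1,
#     adjacency=tfr_adjacency,
# )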

# This time we don't calculate a threshold based on the F distribution.
# We might as well select an arbitrary threshold for cluster forming
tfr_threshold = 15.0

# run cluster based permutation analysis
cluster_stats = spatio_temporal_cluster_test(
    X,
    n_permutations=1000,
    threshold=tfr_threshold,
    tail=1,
    n_jobs=None,
    buffer_size=None,
    adjacency=tfr_adjacency,
)
stat_fun(H1): min=6.752417666025854e-07 max=38.408365559505015
Running initial clustering …
Found 3 clusters

100%|██████████| Permuting : 999/999 [00:09<00:00,  100.28it/s]

Finally, we can plot our results. Clusters in time-frequency-sensor space are difficult to visualize: spectrograms show the time-frequency plane and topomaps show sensor space, but the two are hard to combine in a single view. We will therefore plot topomaps, with the sensors belonging to the cluster colored in white, next to spectrograms. This view is dimensionally limited, however: each sensor has its own set of significant time-frequency points, but in order to display a single spectrogram, every time-frequency point that is significant for any sensor in the cluster is plotted as significant. This difficulty is inherent to visualizing high-dimensional data and should be kept in mind when interpreting the results.

F_obs, clusters, p_values, _ = cluster_stats
good_cluster_inds = np.where(p_values < p_accept)[0]

for i_clu, clu_idx in enumerate(good_cluster_inds):
    # unpack cluster information, get unique indices
    freq_inds, time_inds, space_inds = clusters[clu_idx]
    ch_inds = np.unique(space_inds)
    time_inds = np.unique(time_inds)
    freq_inds = np.unique(freq_inds)

    # get topography for F stat
    f_map = F_obs[freq_inds].mean(axis=0)
    f_map = f_map[time_inds].mean(axis=0)

    # get signals at the sensors contributing to the cluster
    sig_times = epochs.times[time_inds]

    # initialize figure
    fig, ax_topo = plt.subplots(1, 1, figsize=(10, 3), layout="constrained")

    # create spatial mask
    mask = np.zeros((f_map.shape[0], 1), dtype=bool)
    mask[ch_inds, :] = True

    # plot average test statistic and mark significant sensors
    f_evoked = mne.EvokedArray(f_map[:, np.newaxis], epochs.info, tmin=0)
    f_evoked.plot_topomap(
        times=0,
        mask=mask,
        axes=ax_topo,
        cmap="Reds",
        vlim=(np.min, np.max),
        show=False,
        colorbar=False,
        mask_params=dict(markersize=10),
    )
    image = ax_topo.images[0]

    # create additional axes (for ERF and colorbar)
    divider = make_axes_locatable(ax_topo)

    # add axes for colorbar
    ax_colorbar = divider.append_axes("right", size="5%", pad=0.05)
    plt.colorbar(image, cax=ax_colorbar)
    ax_topo.set_xlabel(
        "Averaged F-map ({:0.3f} - {:0.3f} s)".format(*sig_times[[0, -1]])
    )

    # remove the title that would otherwise say "0.000 s"
    ax_topo.set_title("")

    # add new axis for spectrogram
    ax_spec = divider.append_axes("right", size="300%", pad=1.2)
    title = f"Cluster #{i_clu + 1}, {len(ch_inds)} spectrogram"
    if len(ch_inds) > 1:
        title += " (max over channels)"
    F_obs_plot = F_obs[..., ch_inds].max(axis=-1)
    F_obs_plot_sig = np.zeros(F_obs_plot.shape) * np.nan
    F_obs_plot_sig[tuple(np.meshgrid(freq_inds, time_inds))] = F_obs_plot[
        tuple(np.meshgrid(freq_inds, time_inds))
    ]

    for f_image, cmap in zip([F_obs_plot, F_obs_plot_sig], ["gray", "autumn"]):
        c = ax_spec.imshow(
            f_image,
            cmap=cmap,
            aspect="auto",
            origin="lower",
            extent=[epochs.times[0], epochs.times[-1], freqs[0], freqs[-1]],
        )
    ax_spec.set_xlabel("Time (s)")  # extent above is given in seconds
    ax_spec.set_ylabel("Frequency (Hz)")
    ax_spec.set_title(title)

    # add another colorbar
    ax_colorbar2 = divider.append_axes("right", size="5%", pad=0.05)
    plt.colorbar(c, cax=ax_colorbar2)
    ax_colorbar2.set_ylabel("F-stat")

    # clean up viz
plt.show()
[Output figures: Cluster #1 (4 sensors) and Cluster #2 (6 sensors), each showing the averaged F-map topography and the max-over-channels spectrogram.]

Exercises#

  • What is the smallest p-value you can obtain, given the finite number of permutations? You can find the answers in the references [1][2]; a quick numerical check is sketched below.
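
A minimal numerical check of this exercise, assuming (as is common practice, and as the permutation progress above suggests) that the observed ordering is counted as one of the permutations:

# with the observed ordering counted among the n_permutations, the smallest
# attainable p-value is one occurrence out of n_permutations
n_permutations = 1000
print(1.0 / n_permutations)  # 0.001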

References#

Eric Maris and Robert Oostenveld. Nonparametric statistical testing of EEG- and MEG-data. Journal of Neuroscience Methods, 164(1):177–190, 2007.

Jona Sassenhagen and Dejan Draschkow. Cluster-based permutation tests of MEG/EEG data do not establish significance of effect latency or location. Psychophysiology, 56(6):e13335, 2019.

Total running time of the script: (0 minutes 22.571 seconds)
