Using the event system to link figures

Many of MNE-Python’s figures are interactive. For example, you can select channels or scroll through time. The event system allows you to link figures together so that interacting with one figure will simultaneously update another figure.

In this example, we link a topomap plot with a source estimate plot so that selecting a time point in one also updates the time in the other, and we hook our own custom plot into MNE-Python’s event system.

Since the figures on our website don’t have any interaction capabilities, this example will only work properly when run in an interactive environment.

# Author: Marijn van Vliet <w.m.vanvliet@gmail.com>
#
# License: BSD-3-Clause
# Copyright the MNE-Python contributors.
import matplotlib.pyplot as plt

import mne
from mne.viz.ui_events import TimeChange, link, publish, subscribe

# Turn on interactivity
plt.ion()

Linking interactive plots

We load sensor-level and source-level data for the MNE-Sample dataset and create two plots that have sliders controlling the time-point that is shown. By default, both figures are independent, but we will link the event channels of the figures together, so that moving the slider in one figure will also move the slider in the other.

data_path = mne.datasets.sample.data_path()
evokeds_fname = data_path / "MEG" / "sample" / "sample_audvis-ave.fif"
evokeds = mne.read_evokeds(evokeds_fname)
for ev in evokeds:
    ev.apply_baseline()
avg_evokeds = mne.combine_evoked(evokeds, "nave")
fig1 = avg_evokeds.plot_topomap("interactive")

stc_fname = data_path / "MEG" / "sample" / "sample_audvis-meg-eeg"
stc = mne.read_source_estimate(stc_fname)
fig2 = stc.plot("sample", subjects_dir=data_path / "subjects")

link(fig1, fig2)  # link the event channels
[Figure output: interactive topomap (0.090 s, fT) and source estimate plot]
Reading /home/circleci/mne_data/MNE-sample-data/MEG/sample/sample_audvis-ave.fif ...
    Read a total of 4 projection items:
        PCA-v1 (1 x 102) active
        PCA-v2 (1 x 102) active
        PCA-v3 (1 x 102) active
        Average EEG reference (1 x 60) active
    Found the data of interest:
        t =    -199.80 ...     499.49 ms (Left Auditory)
        0 CTF compensation matrices available
        nave = 55 - aspect type = 100
Projections have already been applied. Setting proj attribute to True.
No baseline correction applied
    Read a total of 4 projection items:
        PCA-v1 (1 x 102) active
        PCA-v2 (1 x 102) active
        PCA-v3 (1 x 102) active
        Average EEG reference (1 x 60) active
    Found the data of interest:
        t =    -199.80 ...     499.49 ms (Right Auditory)
        0 CTF compensation matrices available
        nave = 61 - aspect type = 100
Projections have already been applied. Setting proj attribute to True.
No baseline correction applied
    Read a total of 4 projection items:
        PCA-v1 (1 x 102) active
        PCA-v2 (1 x 102) active
        PCA-v3 (1 x 102) active
        Average EEG reference (1 x 60) active
    Found the data of interest:
        t =    -199.80 ...     499.49 ms (Left visual)
        0 CTF compensation matrices available
        nave = 67 - aspect type = 100
Projections have already been applied. Setting proj attribute to True.
No baseline correction applied
    Read a total of 4 projection items:
        PCA-v1 (1 x 102) active
        PCA-v2 (1 x 102) active
        PCA-v3 (1 x 102) active
        Average EEG reference (1 x 60) active
    Found the data of interest:
        t =    -199.80 ...     499.49 ms (Right visual)
        0 CTF compensation matrices available
        nave = 58 - aspect type = 100
Projections have already been applied. Setting proj attribute to True.
No baseline correction applied
Applying baseline correction (mode: mean)
Applying baseline correction (mode: mean)
Applying baseline correction (mode: mean)
Applying baseline correction (mode: mean)
Using control points [ 5.17909658  6.18448887 18.83197989]
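Because the two figures now share an event channel, a time change in either one propagates to the other. As a minimal sketch (assuming the figures above are still open), the same update can also be triggered programmatically by publishing a TimeChange event on one of the linked figures, which is essentially what dragging the slider does:

# Hypothetical follow-up, not part of the original example: publishing a
# TimeChange event on fig1 updates the topomap and, via the link, the source
# estimate plot as well.
publish(fig1, TimeChange(time=0.1))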

Overlaying one figure over another

A common scenario in which the UI event system comes in handy is when plotting multiple things in the same figure. For example, if we want to draw the magnetic field lines on top of a source estimate, we can link the UI event channels together, so that when a different time is selected, both the source estimate and the field lines are updated together.

fig_brain = stc.plot("sample", subjects_dir=data_path / "subjects", surface="white")
fig_brain.show_view(distance=400)  # zoom out a little

field_map = mne.make_field_map(
    avg_evokeds,
    trans=data_path / "MEG" / "sample" / "sample_audvis_raw-trans.fif",
    subject="sample",
    subjects_dir=data_path / "subjects",
)
fig_field = mne.viz.plot_evoked_field(
    avg_evokeds,
    field_map,
    alpha=0.2,
    fig=fig_brain,  # plot inside the existing source estimate figure
    time_label=None,  # there is already a time label in the figure
)

link(fig_brain, fig_field)
fig_brain.set_time(0.1)  # updates both source estimate and field lines
Using control points [ 5.17909658  6.18448887 18.83197989]
Using surface from /home/circleci/mne_data/MNE-sample-data/subjects/sample/bem/sample-5120-5120-5120-bem.fif.
Getting helmet for system 306m
Prepare EEG mapping...
Computing dot products for 59 electrodes...
Computing dot products for 2562 surface locations...
Field mapping data ready
    Preparing the mapping matrix...
    Truncating at 21/59 components to omit less than 0.001 (0.00097)
    The map has an average electrode reference (2562 channels)
Prepare MEG mapping...
Computing dot products for 305 coils...
Computing dot products for 304 surface locations...
Field mapping data ready
    Preparing the mapping matrix...
    Truncating at 210/305 components to omit less than 0.0001 (9.9e-05)

Hooking a custom plot into the event system

In MNE-Python, each figure has an associated event channel. Drawing routines can publish events on the channel and receive events by subscribing to it. When subscribing to an event on a channel, you specify a callback function to be called whenever a drawing routine publishes that event on the channel.

The events are modeled after matplotlib’s event system. Each event has a string name (the snake-case version of its class name) and a list of relevant values. For example, the “time_change” event should have the new time as a value. Values can be any Python object. When publishing an event, the publisher creates a new instance of the event’s class. When subscribing to an event, having to dig up and import the correct class would be a hassle, so, following matplotlib’s example, subscribers refer to the event by its string name.
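As a minimal sketch of this naming convention (using only the objects imported above; fig_demo and print_time_change are throwaway names for illustration), publishing uses the event class, while subscribing refers to the same event by its string name:

# Throwaway illustration of the publish/subscribe naming convention.
fig_demo, _ = plt.subplots()


def print_time_change(event):
    """Print the time carried by a TimeChange event."""
    print(f"received time_change: t = {event.time:.3f} s")


subscribe(fig_demo, "time_change", print_time_change)  # string name
publish(fig_demo, TimeChange(time=0.05))  # class instance; the callback runs
plt.close(fig_demo)  # done with the throwaway figure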

Below, we create a custom plot and then make it publish and subscribe to TimeChange events so it can work together with the plots we created earlier.

# Recreate the earlier plots
fig3 = avg_evokeds.plot_topomap("interactive")
fig4 = stc.plot("sample", subjects_dir=data_path / "subjects")

# Create a custom plot
fig5, ax = plt.subplots()
ax.plot(avg_evokeds.times, avg_evokeds.pick("mag").data.max(axis=0))
time_bar = ax.axvline(0, color="black")  # Our time slider
ax.set_xlabel("Time (s)")
ax.set_ylabel("Maximum magnetic field strength")
ax.set_title("A custom plot")


def on_motion_notify(mpl_event):
    """Respond to matplotlib's mouse event.

    Publishes an MNE-Python TimeChange event. When the mouse goes out of bounds, the
    xdata will be None, which is a special case that needs to be handled.
    """
    if mpl_event.xdata is not None:
        publish(fig5, TimeChange(time=mpl_event.xdata))


def on_time_change(event):
    """Respond to MNE-Python's TimeChange event. Updates the plot."""
    time_bar.set_xdata([event.time])
    fig5.canvas.draw()  # update the figure


# Set up the events for the custom plot. Moving the mouse will trigger a
# matplotlib event, which we will respond to by publishing an MNE-Python UI
# event. Upon receiving a UI event, we will move the vertical line.
plt.connect("motion_notify_event", on_motion_notify)
subscribe(fig5, "time_change", on_time_change)

# Link all the figures together.
link(fig3, fig4, fig5)

# Method calls like this also emit the appropriate UI event.
fig4.set_time(0.1)
[Figure output: interactive topomap (0.100 s, fT), source estimate plot, and the custom plot]
Using control points [ 5.17909658  6.18448887 18.83197989]

Total running time of the script: (0 minutes 19.391 seconds)

Estimated memory usage: 27 MB
