This page describes some of the major medium- to long-term goals for MNE-Python. These are goals that require substantial effort and/or API design considerations. Some of these may be suitable for Google Summer of Code projects, while others require more extensive work.


Time-frequency visualization#

We should implement a viewer for interactive visualization of volumetric source-time-frequency (5-D) maps on MRI slices (orthogonal 2D viewer). NutmegTrip (written by Sarang Dalal) provides similar functionality in MATLAB in conjunction with FieldTrip; an example of NutmegTrip’s source-time-frequency mode in action is available on YouTube.

Clustering statistics API#

The current clustering statistics code has limited functionality. It should be re-worked to create a new cluster_based_statistic or similar function. In particular, the new API should:

  1. Support mixed within- and between-subjects designs, different statistical functions, etc. This should be done via a design argument that mirrors patsy.dmatrices() or similar community standard (e.g., this is what is used by statsmodels.regression.linear_model.OLS).

  2. Have clear tutorials showing how different contrasts can be done (toy data).

  3. Have clear tutorials showing some common analyses on real data (time-freq, sensor space, source space, etc.)

  4. Not introduce any significant speed penalty (e.g., < 10% slower) compared to the existing, more specialized/limited functions.

More details are in #4859.
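To make the intended scope concrete, here is a minimal NumPy sketch of the core computation that such a function would generalize: a one-sided, two-sample cluster-based permutation test on 1-D data. The function names and structure are illustrative only, not the proposed MNE-Python API.

```python
# Minimal 1-D cluster-based permutation test (two-sample, one-sided).
# Illustrative sketch only -- not the proposed cluster_based_statistic API.
import numpy as np

def two_sample_t(a, b):
    """Welch-style t statistic at each time point (shape: n_times)."""
    va = a.var(axis=0, ddof=1) / len(a)
    vb = b.var(axis=0, ddof=1) / len(b)
    return (a.mean(axis=0) - b.mean(axis=0)) / np.sqrt(va + vb)

def cluster_masses(t, threshold):
    """Sum of t values within each contiguous suprathreshold cluster."""
    masses, current = [], 0.0
    for v in t:
        if v > threshold:
            current += v
        elif current:
            masses.append(current)
            current = 0.0
    if current:
        masses.append(current)
    return masses

def cluster_test(a, b, threshold=2.0, n_perm=200, seed=0):
    """Return (max observed cluster mass, permutation p-value)."""
    rng = np.random.default_rng(seed)
    observed = cluster_masses(two_sample_t(a, b), threshold)
    if not observed:
        return None, 1.0
    max_obs = max(observed)
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # permute condition labels (rows)
        masses = cluster_masses(two_sample_t(pooled[:len(a)], pooled[len(a):]), threshold)
        if masses and max(masses) >= max_obs:
            count += 1
    return max_obs, (count + 1) / (n_perm + 1)
```

A unified API would replace the hard-coded two-sample contrast here with a design-matrix specification, while keeping this clustering/permutation core.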

Access to open EEG/MEG databases#

We should improve access to open EEG/MEG databases via the mne.datasets module, i.e., improve our dataset fetchers. We currently have a fetcher for PhysioNet, but many more databases exist. Having a consistent API for accessing multiple data sources would be valuable. See #2852 and #3585 for some ideas, as well as:

  • OpenNEURO

    “A free and open platform for sharing MRI, MEG, EEG, iEEG, and ECoG data.” See for example #6687.

  • Human Connectome Project Datasets

    Over a 3-year span (2012-2015), the Human Connectome Project (HCP) scanned 1,200 healthy adult subjects. The available data includes MR structural scans, behavioral data and (on a subset of the data) resting state and/or task MEG data.

  • MMN dataset

    Used for tutorial/publications applying DCM for ERP analysis using SPM.

  • Kymata datasets

    Current and archived EMEG measurement data, used to test hypotheses in the Kymata atlas. The participants are healthy human adults listening to the radio and/or watching films, and the data consist of (averaged) EEG and MEG sensor data and source current reconstructions.

  • BNCI Horizon

    BCI datasets.

Integrate OpenMEEG via improved Python bindings#

OpenMEEG is a state-of-the-art solver for forward modeling in the field of brain imaging with MEG/EEG. It numerically solves partial differential equations (PDEs). It is written in C++ with Python bindings written in SWIG. The ambition of this project is to integrate OpenMEEG into MNE, giving MNE the ability to solve more forward problems (cortical mapping, intracranial recordings, etc.). Software tasks to be completed include:

  • Clean up the Python bindings (remove unused functions, check memory management, etc.)

  • Write example scripts for OpenMEEG that automatically generate web pages, as is done for MNE

  • Understand how MNE encodes information about sensors (location, orientation, integration points, etc.) so that OpenMEEG can make use of it

  • Help package OpenMEEG for Debian/Ubuntu

  • Help manage the continuous integration system

In progress#

Diversity, Equity, and Inclusion (DEI)#

MNE-Python is committed to recruiting and retaining a diverse pool of contributors, see #8221.

First-class OPM support#

MNE-Python has support for reading some OPM data formats such as FIF, but support is still rudimentary. Support should be added for other manufacturers, and standard (and/or novel) preprocessing routines should be added to deal with coregistration adjustment, forward modeling, and OPM-specific artifacts.

Deep source modeling#

Existing source modeling and inverse routines are not explicitly designed to deal with deep sources. Advanced algorithms exist from MGH for enhancing deep source localization, and these should be implemented and vetted in MNE-Python.

Better sEEG/ECoG/DBS support#

Some support already exists for iEEG electrodes in MNE-Python thanks in part to standard abstractions. However, iEEG-specific pipeline steps (e.g., electrode localization) and visualizations (e.g., per-shaft topo plots, time-frequency visualization) are missing. MNE-Python should collaborate with members of the ECoG/sEEG community to adopt or build on existing tools, and extend native functionality for depth electrodes.

Time-frequency classes#

The classes related to TFRs in our current codebase are incomplete. We should implement new classes from the ground up that can hold frequency data (Spectrum), cross-spectral data (CrossSpectrum), multitaper estimates (MultitaperSpectrum), and time-varying estimates (Spectrogram). These should work for continuous, epoched, and averaged sensor data, as well as for source-space brain data.

See related issues #6290, #7671, #8026, #8724, #9045, and PRs #6609, #6629, #6672, #6673, #8397, and #8892.
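As a rough illustration of the intended hierarchy, here is a minimal sketch in which Spectrogram extends Spectrum with a time axis. The class names come from this roadmap, but every attribute and method shown is hypothetical rather than an implemented API.

```python
# Hypothetical sketch of the proposed time-frequency class hierarchy.
# Class names follow the roadmap; attributes/methods are illustrative only.
import numpy as np

class Spectrum:
    """Frequency-domain data with shape (..., n_channels, n_freqs)."""

    def __init__(self, data, freqs, ch_names):
        self.data = np.asarray(data)
        self.freqs = np.asarray(freqs)
        self.ch_names = list(ch_names)

    def get_band(self, fmin, fmax):
        """Average power within a frequency band."""
        mask = (self.freqs >= fmin) & (self.freqs <= fmax)
        return self.data[..., mask].mean(axis=-1)

class Spectrogram(Spectrum):
    """Time-varying estimates with shape (..., n_channels, n_freqs, n_times)."""

    def __init__(self, data, freqs, times, ch_names):
        super().__init__(data, freqs, ch_names)
        self.times = np.asarray(times)

    def get_band(self, fmin, fmax):
        # Average over the frequency axis, keeping the time axis intact.
        mask = (self.freqs >= fmin) & (self.freqs <= fmax)
        return self.data[..., mask, :].mean(axis=-2)
```

A shared base class like this is what would let the same code path serve continuous, epoched, averaged, and source-space data: only the leading dimensions of `data` change.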

Pediatric and clinical MEG pipelines#

MNE-Python is in the process of providing automated analysis of BIDS-compliant datasets, see MNE-BIDS-Pipeline. By incorporating functionality from the mnefun pipeline, which has been used extensively for pediatric data analysis at I-LABS, better support for pediatric and clinical data processing can be achieved. Multiple processing steps (e.g., eSSS), sanity checks (e.g., cHPI quality), and reporting (e.g., SSP joint plots, SNR plots) will be implemented.

Statistics efficiency#

A key technique in functional neuroimaging analysis is clustering brain activity in adjacent regions prior to statistical analysis. An important clustering algorithm — threshold-free cluster enhancement (TFCE) — currently relies on computationally expensive permutations for hypothesis testing. A faster, probabilistic version of TFCE (pTFCE) is available, and we are in the process of implementing this new algorithm.
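For concreteness, the TFCE enhancement step itself (the quantity whose permutation null distribution makes the current approach expensive, and which pTFCE approximates probabilistically) can be sketched for 1-D data as follows; the default exponents E=0.5 and H=2 are the values commonly used in the literature.

```python
# Minimal 1-D sketch of threshold-free cluster enhancement (TFCE):
# integrate (cluster extent)^E * (height)^H over thresholds h.
import numpy as np

def tfce_1d(stat, dh=0.1, E=0.5, H=2.0):
    """Enhance a 1-D statistic map with TFCE (positive values only)."""
    out = np.zeros_like(stat, dtype=float)
    for h in np.arange(dh, stat.max() + dh, dh):
        above = (stat >= h).astype(int)
        # find contiguous runs of suprathreshold samples
        starts = np.flatnonzero(np.diff(np.r_[0, above]) == 1)
        ends = np.flatnonzero(np.diff(np.r_[above, 0]) == -1) + 1
        for s, e in zip(starts, ends):
            out[s:e] += ((e - s) ** E) * (h ** H) * dh
    return out
```

In real use the enhanced map, not the raw statistic, is what gets compared against a permutation distribution; pTFCE's contribution is replacing that permutation loop with a probabilistic approximation.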

3D visualization#

Historically we have used Mayavi for 3D visualization, but have faced limitations and challenges with it. We should work to use some other backend (e.g., PyVista) to get major improvements, such as:

  1. Proper notebook support (through ipyvtklink) (complete)

  2. Better interactivity with surface plots (complete)

  3. Time-frequency plotting (complementary to volume-based Time-frequency visualization)

  4. Integration of multiple functions as done in mne_analyze, e.g., simultaneous source estimate viewing, field map viewing, head surface display, etc. These are all currently available in separate functions, but we should be able to combine them in a single plot as well.

The meta-issue for tracking to-do lists for surface plotting is #7162.

Documentation updates#

Our documentation has many minor issues, which can be found under the DOC label in our issue tracker.


Distributed computing support#

MNE-BIDS-Pipeline has been enhanced with support for cloud computing via Dask and joblib. After configuring Dask to use local or remote distributed computing resources, MNE-BIDS-Pipeline can readily make use of remote workers to parallelize processing across subjects.
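The per-subject parallelization pattern can be sketched with the standard library's executors; MNE-BIDS-Pipeline itself delegates this step to joblib or a Dask cluster, and the `process_subject` function here is a stand-in, not actual pipeline code.

```python
# Shape of per-subject parallelization, shown with the stdlib for
# self-containedness; a real deployment hands the same independent
# per-subject tasks to joblib or a Dask cluster of remote workers.
from concurrent.futures import ThreadPoolExecutor

def process_subject(subject):
    """Stand-in for one subject's processing (filtering, epoching, ...)."""
    return f"sub-{subject:02d}", "ok"

def run_pipeline(subjects, max_workers=4):
    """Process all subjects concurrently; return {subject_id: status}."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(process_subject, subjects))
```

The key property exploited here is that subjects are independent, so the map can be distributed without inter-task communication.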

2D visualization#

This goal was completed under CZI EOSS2. Some additional enhancements that could also be implemented are listed in #7751.

Tutorial / example overhaul#

This goal was completed under CZI EOSS2. Ongoing documentation needs are listed in Documentation updates.

Cluster computing images#

As part of this goal, we created docker images suitable for cloud computing via MNE-Docker.