mne.simulation.simulate_sparse_stc

mne.simulation.simulate_sparse_stc(src, n_dipoles, times, data_fun=<function <lambda>>, labels=None, random_state=None)

Generate sparse (n_dipoles) source time courses from data_fun.

This function randomly selects n_dipoles vertices across the whole cortex, or a single vertex within each label if labels is not None, and uses data_fun to generate a waveform for each selected vertex.

Parameters:

src : instance of SourceSpaces

The source space.

n_dipoles : int

Number of dipoles to simulate.

times : array

The time array, in seconds.

data_fun : callable

Function used to generate the waveforms. The default is a 100 nAm, 10 Hz sinusoid, i.e. 1e-7 * np.sin(20 * pi * t). The function should take as input the array of time samples in seconds and return an array of the same length containing the time courses (see the example at the end of this page).

labels : None | list of Labels

The labels. The default is None; otherwise, the list must contain n_dipoles labels (one vertex is selected per label).

random_state : None | int | np.random.RandomState

Seed or state used to initialize the random number generator, making the random selection of vertices reproducible.

Returns:

stc : SourceEstimate

The generated source time courses.

Notes

New in version 0.10.0.
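
Examples

A minimal usage sketch with a custom data_fun. The source space file name and the 100 Hz sampling rate below are assumptions for illustration only; substitute any existing -src.fif file and sampling rate of your own.

import numpy as np
import mne
from mne.simulation import simulate_sparse_stc

# Hypothetical path; point this at a real source space (-src.fif) file,
# e.g. one from the MNE sample dataset.
src = mne.read_source_spaces('sample-oct-6-src.fif')

sfreq = 100.  # assumed sampling frequency in Hz
times = np.arange(100, dtype=float) / sfreq  # 1 s of time samples, in seconds

def data_fun(times):
    """Return a 50 nAm, 5 Hz sinusoid (replaces the default waveform)."""
    return 5e-8 * np.sin(2. * np.pi * 5. * times)

stc = simulate_sparse_stc(src, n_dipoles=2, times=times,
                          data_fun=data_fun, random_state=42)

Passing random_state makes the random choice of the two source vertices reproducible across runs; with labels given instead, one vertex would be drawn from each label.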