The Virtual Brain: An open-source simulator for whole brain network modeling.
There are several modeling studies using brain network models which incorporate biologically realistic macroscopic connectivity (the so-called connectome) to understand the global dynamics observed in the healthy and diseased brain measured by different neuroimaging modalities such as fMRI, EEG and MEG.
For this particular modelling approach in Computational Neuroscience, open source frameworks enabling the collaboration between researchers with different backgrounds are not widely available. The Virtual Brain is, so far, the only neuroinformatics project filling that place.
All projects below can be tailored to a 12-week time window, either full-time or part-time, as the features/pages can be built incrementally.
[1] Integrate TVB with Zenodo
Description: TVB (https://www.thevirtualbrain.org/) has a demo dataset currently published on Zenodo (https://zenodo.org/record/4263723#.Ydi-dX1Bzx4).
Currently we download it manually, unzip it, and then use it inside the TVB code and web GUI.
We intend to use the Zenodo API instead.
The task is mainly suited for part-time if only the feature above is implemented, but it can be extended with other external data sources for full-time applicants.
More details here: https://req.thevirtualbrain.org/browse/TVB-2607 or here #634
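As a rough illustration of what the Zenodo integration could look like, here is a minimal sketch against Zenodo's public REST API (`https://zenodo.org/api/records/{id}`). The field names (`key`, `links.self`) follow the record JSON the API returns today, but should be verified against the current Zenodo documentation; the function names are invented for this sketch and are not part of any TVB API.

```python
import json
from urllib.request import urlopen

# Zenodo REST API endpoint for a single record (public, no auth needed).
ZENODO_API = "https://zenodo.org/api/records/{}"

def record_url(record_id):
    """Build the API URL for a Zenodo record id."""
    return ZENODO_API.format(record_id)

def file_links(record_json):
    """Map each file name in a record's JSON to its download URL.

    Assumes the record JSON layout: each entry in "files" has a "key"
    (file name) and "links"["self"] (direct download URL).
    """
    return {f["key"]: f["links"]["self"] for f in record_json.get("files", [])}

def fetch_record(record_id):
    """Fetch record metadata; in TVB this would replace the manual
    download/unzip step with programmatic access."""
    with urlopen(record_url(record_id)) as resp:
        return json.loads(resp.read())
```

For the demo dataset above, `fetch_record(4263723)` would return the metadata, and `file_links(...)` the URLs to stream or cache locally instead of shipping a manually unzipped copy.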
Expected results: A set of classes, with a demo Jupyter notebook and unit tests.
Preferred Tech keywords: Python, Zenodo, Jupyter.
Skills level: junior+, mid
Mentors: Lia Domide (lead), Romina Baila (backup)
[2] Structural Connectivity editor widget
Description: In the TVB (https://www.thevirtualbrain.org/) ecosystem there is a new repository called tvb-widgets, offering UI widgets for Jupyterlab environments. These widgets are compatible with TVB data formats and can display them in different forms: for example, a 2D viewer for time series or a 3D viewer for brain surfaces. The purpose of this project is to implement a new widget that allows users to edit the connectivity matrices involved in a TVB simulation. Necessary features for this widget: display the connectivity matrix, normalize the matrix, resect connections, resect nodes, change connection weights, and save the resulting connectivity. Of course, this new widget has to run in a Jupyterlab notebook as well.
Finally, it would be great to have all the widgets linked into the tvb-ext-xircuits repository, which is a Jupyterlab extension based on React JS. At the moment, only the PhasePlaneWidget is linked there, but the rest could be added in a similar manner.
Examples of TVB data formats can be found on Zenodo. Connectivity matrices are available there as well.
Check out our Jupyter notebooks to play with the widgets we have available so far.
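To make the editing operations concrete, here is a toy sketch of the logic such a widget would wrap (display omitted). The class and method names are invented for illustration and are not the tvb-widgets API; a real implementation would operate on TVB `Connectivity` objects and drive an ipywidgets front end.

```python
import numpy as np

class ConnectivityEditor:
    """Toy model of the edit operations the widget would expose."""

    def __init__(self, weights, labels):
        self.weights = np.asarray(weights, dtype=float)  # region-to-region weights
        self.labels = list(labels)                       # region names

    def normalize(self):
        """Scale all weights into [0, 1] by the maximum weight."""
        m = self.weights.max()
        if m > 0:
            self.weights /= m

    def resect_connection(self, i, j):
        """Cut the connection between regions i and j (both directions)."""
        self.weights[i, j] = 0.0
        self.weights[j, i] = 0.0

    def resect_node(self, i):
        """Remove region i entirely, shrinking the matrix and label list."""
        keep = [k for k in range(len(self.labels)) if k != i]
        self.weights = self.weights[np.ix_(keep, keep)]
        self.labels = [self.labels[k] for k in keep]

    def set_weight(self, i, j, value):
        """Change a single connection weight."""
        self.weights[i, j] = float(value)
```

The "save resulting connectivity" feature would then serialize `weights` and `labels` back into a TVB connectivity format.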
Expected results: A set of classes, with a demo Jupyter notebook and unit tests.
Preferred Tech keywords: Python, IPywidgets, React JS, Jupyterlab, Jupyterlab extensions
Skills level: junior+, mid
Mentors: Lia Domide (lead), Romina Baila (backup)
[3] tvb-multiscale
Description: In the TVB (https://www.thevirtualbrain.org/) ecosystem there is a toolbox called TVB-Multiscale (https://github.com/the-virtual-brain/tvb-multiscale), which allows for Co-Simulation of TVB with spiking network simulators. The toolbox currently includes interfaces to NEST (https://nest-simulator.readthedocs.io/en/v3.3/) (via the pynest interface), ANNarchy (https://annarchy.readthedocs.io/en/latest/) and NEURON (https://neuron.yale.edu/neuron/) via the NETPYNE (http://www.netpyne.org/) Python interface. Co-Simulation means that most of the brain is modeled by TVB using mean field dynamical models (the equations of which describe the average activity of millions of neurons), whereas a few selected brain regions are modeled at a finer scale as spiking neuronal networks. Both parts of the model are then simulated at the same time and exchange data via the necessary transformations between mean field activity and total spiking activity. TVB-Multiscale is already actively used in brain modeling studies (e.g., see the first related publication, modeling virtual Deep Brain Stimulation to a TVB brain, where a basal ganglia network is modeled as a spiking neuronal network in ANNarchy: https://www.sciencedirect.com/science/article/pii/S0014488622001364?via%3Dihub).
Currently, Co-Simulation takes place sequentially: the integration of each model by the corresponding simulator, as well as the bidirectional transformation and transfer of data, follow one after the other and are executed by a single process.
The objective of this project is to work towards the next major release (3.x), which will allow Co-Simulation, data transformation and data exchange to happen in parallel, via multiprocessing. There is already ongoing work (https://github.com/the-virtual-brain/tvb-multiscale/tree/ray) towards a solution that will allow interactive configuration and execution of the parallel Co-Simulation in Jupyter notebooks, with an API as similar as possible to the one currently used by the sequential Co-Simulation. This solution uses Ray (https://www.ray.io/), a framework that makes parallel computation easier. Unit and integration tests, as well as documentation, are additional aspects of the work that needs to be done.
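To illustrate the intended parallelism, here is a toy sketch of one co-simulation step in which both simulators advance concurrently and then exchange transformed outputs. All function names and the dynamics are invented stand-ins (the real work targets TVB-Multiscale classes and Ray actors); plain `concurrent.futures` is used here only to keep the sketch self-contained.

```python
from concurrent.futures import ThreadPoolExecutor

def tvb_step(state):
    """Stand-in for one mean-field integration step in TVB."""
    return state + 0.1

def spiking_step(spikes):
    """Stand-in for one spiking-network integration step (e.g. in NEST)."""
    return spikes + 1

def rate_to_spikes(rate):
    """Transformer: mean-field rate -> spiking-network input (toy scaling)."""
    return int(round(rate * 10))

def spikes_to_rate(spikes):
    """Transformer: total spiking activity -> mean-field input (toy scaling)."""
    return spikes / 10.0

def cosim_step(state, spikes):
    """One parallel co-simulation step: both simulators run concurrently,
    then their transformed outputs become next-step couplings."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        f_tvb = pool.submit(tvb_step, state)
        f_spk = pool.submit(spiking_step, spikes)
        new_state, new_spikes = f_tvb.result(), f_spk.result()
    coupling_to_spiking = rate_to_spikes(new_state)  # drives the spiking net next
    coupling_to_tvb = spikes_to_rate(new_spikes)     # drives TVB next
    return new_state, new_spikes, coupling_to_tvb, coupling_to_spiking
```

In the Ray-based design, each simulator and transformer would instead live in its own remote actor or task, so the two integrations (and the two transformation directions) genuinely run in separate processes while keeping the same step-wise exchange pattern.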
Expected results: A set of classes extending the existing TVB-Multiscale classes and allowing for parallel Co-Simulation, as well as related examples, tests and documentation, to be included in the next major release of TVB-Multiscale (3.x).
Preferred Tech keywords: Python, Jupyter (optionally multiprocessing, especially ray or MPI)
Skills level: mid, mid+
Mentors: Dionysios Perdikis (lead), Lia Domide (backup), Michael Schirner and Petra Ritter