This repository has been archived by the owner on Dec 12, 2022. It is now read-only.
Merge pull request #30 from DeLaVlag/master
Latest cuda model generator + example
Showing 110 changed files with 856 additions and 13,057 deletions.
dsl_cuda/GPUmemindex.png → dsl/GPUmemindex.png
100644 → 100755
File renamed without changes
@@ -1,14 +1,14 @@
 # TVB CUDA model generation using LEMS format
 This readme describes the usage of the code generation for models defined in LEMS based XML to Cuda (C) format.
 The LEMS format PR has been adopted and altered to match TVB model names.
-In LEMSCUDA.py the function "cuda_templating(Model+'_CUDA')" will start the code generation.
-It expects a [model+'_CUDA'].xml file to be present in tvb/dsl_cuda/NeuroML/XMLmodels.
-The generated file will be placed in tvb/simulator/models.
+In LEMSCUDA.py the function "cuda_templating(Model+'_CUDA', 'path/to/XMLmodels')" will start the code generation.
+It expects a [model+'_CUDA'].xml file to be present in ['path/to/XMLmodels'].
+The generated file will be placed in ['installpath']'/tvb-hpc/dsl/dsl_cuda/CUDAmodels/'.
 The produced filename is a lower cased [model].py which contains a class named [model].
-In the directory TVB_testsuite the files to run the models on the GPU can be found.
-Execute './runthings cuda Modelname' to start the parameter sweep based simulation.
 
 .. moduleauthor:: Michiel. A. van der Vlag <[email protected]>
 .. moduleauthor:: Marmaduke Woodman <[email protected]>
 .. moduleauthor:: Sandra Diaz <[email protected]>
 
+# The CUDA memory model specification
+![](GPUmemindex.png)
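For readers following the updated README text in this hunk, a minimal sketch of the generator call might look as follows. Only the "cuda_templating(Model+'_CUDA', 'path/to/XMLmodels')" signature comes from the README; the import path and the 'Kuramoto' model name are assumptions for illustration.

```python
# Minimal sketch, assuming LEMSCUDA.py is importable from the dsl_cuda package
# and that a Kuramoto_CUDA.xml file exists in the XML model directory.
from dsl_cuda.LEMSCUDA import cuda_templating  # assumed import path

model = 'Kuramoto'              # hypothetical model name
xml_dir = '/path/to/XMLmodels'  # directory holding [model]_CUDA.xml files

# Per the README, this writes a lower-cased kuramoto.py (containing class Kuramoto)
# into the CUDAmodels output directory.
cuda_templating(model + '_CUDA', xml_dir)
```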
@@ -29,7 +29,7 @@ Mako templating
 
 # XML LEMS Definitions
 Based on http://lems.github.io/LEMS/elements.html but attributes are tuned for TVB CUDA models.
-As an example an XML line and its translation to CUDA are given.
+As an example an XML line and its translation to CUDA are given below.
 
 * Constants\
 If domain = 'none' no domain range will be added.\
@@ -180,9 +180,12 @@ for (unsigned int j_node = 0; j_node < n_node; j_node++)
 ```
-# Running
-Place model file in directory and execute cuda_templating('modelname') function. Resulting model will be
-placed in the CUDA model directory
-# TODO
-Add CUDA model validation tests.
+# Running an example
+Place an xml model file in the directory used for your XML model storage and execute the "cuda_templating(Model+'_CUDA',
+'path/to/XMLmodels')" function. The resulting model will be placed in the CUDA model directory.
+The directory 'tvb-hpc/dsl/dsl_cuda/example/' holds an example of how to run the model generator and the CUDA model
+on a GPU.
+From this directory, execute './runthings cuda [Modelname]' to start model generation corresponding to an xml file
+and a parameter sweep simulation with the produced model file on a CUDA enabled machine.
+The cuda parameter indicates that a cuda simulation is to be started and the [modelname] parameter is the model
+that is the target of the simulation.
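As a complement to the "Running an example" section added in this hunk, a hedged end-to-end sketch is given below. The 'Oscillator' model name, the directory layout, and the import path are placeholders; the actual GPU parameter sweep is started with the './runthings cuda [Modelname]' script documented above, not by this snippet.

```python
# Sketch only: generate a CUDA model from its XML definition and report where
# the README says the output lands. Names and paths here are assumptions.
import os
from dsl_cuda.LEMSCUDA import cuda_templating  # assumed import path

model = 'Oscillator'                         # hypothetical model; Oscillator_CUDA.xml assumed present
xml_dir = 'path/to/XMLmodels'                # your XML model storage directory
out_dir = 'tvb-hpc/dsl/dsl_cuda/CUDAmodels'  # output directory named in the README

cuda_templating(model + '_CUDA', xml_dir)

# Per the README, the produced file is a lower-cased [model].py containing class [model].
print('Generated:', os.path.join(out_dir, model.lower() + '.py'))

# The GPU run itself is then started from tvb-hpc/dsl/dsl_cuda/example/ with:
#   ./runthings cuda Oscillator
```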
dsl_cuda/NeuroML/XMLmodels/__init__.py → dsl/__init__.py
100644 → 100755
File renamed without changes.