Domain-Specific Language

With the N2S3 simulator comes a small Domain-Specific Language (DSL), aimed at providing a simplified interface for users not yet familiar with the Scala programming language. For now, it only supports basic experiment writing, that is:

* Network definition

  • Creation of neurons
    • Neuron model setting
    • Parameters setting
  • Creation of connections
    • Synapse model setting
    • Parameters setting
  • Observing the output of the experiment

* Simulation

  • Run on a training set
  • Run on a test set

The syntax of the DSL aims to be declarative, with as little “noise” as possible, that is, Scala keywords, special characters, etc.
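
As a condensed preview, a complete experiment can be written as the short sequence of statements below; each statement is explained step by step in the rest of this page, assuming the imports and the implicit network object introduced in the next section. The full walkthrough below also includes a few additional steps, such as setting the QBG parameters and observing the network.

network hasInput InputMnist.Entry >> SampleToSpikeTrainConverter[Float, InputSample2D[Float]](0, 23, 150 MilliSecond, 350 MilliSecond) >> N2S3Entry

network hasInputNeuronGroup "input"

network hasNeuronGroup "group_1" ofSize 30 ofModel LIF

"input" connectsTo "group_1" using FullConnection withSynapse SimplifiedSTDP

network buildit

network trainMNIST

network testMNIST

network destroyit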

Using the DSL

First things first, we have to import the classes used by the DSL.

import fr.univ_lille.cristal.emeraude.n2s3.dsl.N2S3SimulationDSL
import fr.univ_lille.cristal.emeraude.n2s3.dsl.N2S3DSLImplicits._
import fr.univ_lille.cristal.emeraude.n2s3.features.io.input.{InputMnist, InputSample2D, N2S3Entry, SampleToSpikeTrainConverter}
import fr.univ_lille.cristal.emeraude.n2s3.core.models.properties.MembranePotentialThreshold
import fr.univ_lille.cristal.emeraude.n2s3.features.builder.connection.types.FullConnection
import fr.univ_lille.cristal.emeraude.n2s3.features.io.input._
import fr.univ_lille.cristal.emeraude.n2s3.models.neurons.LIF
import fr.univ_lille.cristal.emeraude.n2s3.models.synapses.{InhibitorySynapse, QBGParameters, SimplifiedSTDP}
import fr.univ_lille.cristal.emeraude.n2s3.support.N2S3ResourceManager
import fr.univ_lille.cristal.emeraude.n2s3.support.UnitCast._
import squants.electro.ElectricPotentialConversions.ElectricPotentialConversions

The first step of creating a neural network experiment, which still has a bit of Scala in it, is to build an N2S3SimulationDSL object, which will act as the interface between the DSL and the Scala code of N2S3. It needs to be implicit, as it will implicitly be passed as an argument to almost every function of the DSL.

implicit val network = N2S3SimulationDSL()
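
To make the role of this implicit value clearer, here is a generic Scala sketch of the implicit-parameter pattern the DSL relies on; the Network type and the addGroup function below are hypothetical illustrations and are not part of the N2S3 API.

// Hypothetical illustration of implicit parameters, not actual N2S3 code
case class Network(name: String)

object Dsl {
  // The implicit parameter list lets callers omit the network argument
  def addGroup(groupName: String)(implicit network: Network): Unit =
    println(s"adding group '$groupName' to network '${network.name}'")
}

implicit val network = Network("demo")

Dsl.addGroup("group_1")          // the implicit network is passed automatically
Dsl.addGroup("group_2")(network) // equivalent explicit call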

Then, we can start building the network by specifying the QBG parameters.

For the QBG parameters, the user can run experiments with defaultQGBParameters, QGBParametersforMNIST or QGBParametersforAER. The method for defining QBG parameters of the user's own choice is described in the DSL Documentation Report.

defaultQGBParameters

Now let's provide the network with the input dataset. The DSL comes with two predefined functions, hasMNISTdata and hasAERdata, to add the MNIST and AER datasets, but the input can also be described explicitly, as for MNIST data below:

network hasInput InputMnist.Entry >> SampleToSpikeTrainConverter[Float, InputSample2D[Float]](0, 23, 150 MilliSecond, 350 MilliSecond) >> N2S3Entry

Since MNIST and AER are the initial examples, special functions have been defined to import their input, so the user can also use the shorter statement below instead. For creating such special functions for other datasets, please refer to the DSL Documentation Report.

network hasMNISTdata

In the case of AER data, the user can use

network hasInput InputAER.Entry >> InputAER.Retina(128, 128, separateSign = false) >> N2S3Entry

Now let's describe the input neuron group and give it a name.

network hasInputNeuronGroup "input"

And a few neurons.

network hasNeuronGroup "group_1" ofSize 30 ofModel LIF

This line creates a group of 30 leaky integrate-and-fire (LIF) neurons. Its name, “group_1”, will be used later to access it. For instance, if we want to set the threshold of the neurons in the group, we can write

"group_1" hasParameters (MembranePotentialThreshold -> 35.millivolts)

Of course, we can add inhibitory connections between neurons of the same group, in order to enable competitive learning.

"group_1" connectsTo "group_1" using FullConnection withSynapse InhibitorySynapse

And we should not forget to connect the input either.

"input" connectsTo "group_1" using  FullConnection withSynapse SimplifiedSTDP

Now that the network is all set up, we can build it.

network buildit

To observe how the example behaves, we can attach an observer to the connections between the input and the created group.

observeConnectionBetween("input", "group_1")

Now let's define the paths to the training and testing datasets and labels.

val dataFile = N2S3ResourceManager.getByName("mnist-train-images").getAbsolutePath
val labelFile = N2S3ResourceManager.getByName("mnist-train-labels").getAbsolutePath

val dataTestFile = N2S3ResourceManager.getByName("mnist-test-images").getAbsolutePath
val labelTestFile = N2S3ResourceManager.getByName("mnist-test-labels").getAbsolutePath

We can now train the network.

network trainOn MnistFileInputStream(dataFile, labelFile)

A special method is also available to train the network directly on MNIST without specifying the paths to the training dataset and labels; simply type the statement below. To define such a function for your own example, please refer to the DSL Documentation Report.

network trainMNIST

And once it is trained, we can test its efficiency.

network testOn MnistFileInputStream(dataTestFile, labelTestFile)

Similarly, a special method is available to test the network directly on MNIST without specifying the paths to the testing dataset and labels; simply type the statement below. To define such a function for your own example, please refer to the DSL Documentation Report.

network testMNIST

We can now destroy the network.

network destroyit

Note: to avoid errors due to “Optional Dots in the Method Invocation” (Scala may otherwise read the next line as the argument of a statement written in postfix notation), leave a blank line after each statement in the code.
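
As an illustration of this pitfall, consider the two statements below (taken from the example above) written without a blank line in between; this is a sketch of how Scala's parser may behave, not anything specific to N2S3.

network buildit
observeConnectionBetween("input", "group_1")

// Scala may parse the two lines above as a single expression,
// network.buildit(observeConnectionBetween("input", "group_1")),
// treating the second line as the argument of the postfix call.
// A blank line after the postfix statement removes the ambiguity:

network buildit

observeConnectionBetween("input", "group_1")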
