Configuration system¶
NengoDL uses Nengo’s config system to give users more fine-grained control over aspects of the simulation. In general, most users will not need to worry about these options and can leave them at their default settings. However, these options may be useful in some scenarios.
configure_settings() is a utility function that can be used to set these configuration options. It needs to be called within a Network context, as in:

    with nengo.Network() as net:
        nengo_dl.configure_settings(config_option=config_value, ...)
        ...
Each call to configure_settings only sets the configuration options specified in that call. That is,

    nengo_dl.configure_settings(option0=val0)
    nengo_dl.configure_settings(option1=val1)

is equivalent to

    nengo_dl.configure_settings(option0=val0, option1=val1)
Under the hood, configure_settings sets config attributes on the top-level network. All of the same effects could be achieved by setting those config attributes directly; configure_settings simply makes this process easier.
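As a rough sketch of what this means in practice, assuming the settings are registered as attributes on net.config[nengo.Network] (the top-level network’s config, which is where configure_settings stores them), a configured value can be read back directly:

    import nengo
    import nengo_dl
    import tensorflow as tf

    with nengo.Network() as net:
        # register and set the option through the utility function
        nengo_dl.configure_settings(dtype=tf.float64)

    # the value lives on the top-level network's config, so (assuming it is
    # stored under net.config[nengo.Network]) it can be read back directly
    print(net.config[nengo.Network].dtype)  # tf.float64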
Options¶
trainable¶
The trainable config attribute can be used to control which parts of a model will be optimized by the Simulator.train() process. Passing any value for trainable to configure_settings adds a configurable trainable attribute to the objects in a network; trainable=None uses the default trainability settings, while trainable=True/False overrides the default for all objects.
Once the trainable attribute has been added to all the objects in a model, the config system can then be used to control the trainability of individual objects/networks.
For example, suppose we only want to optimize one connection in our network, while leaving everything else unchanged. This could be achieved via
    with nengo.Network() as net:
        # this adds the `trainable` attribute to all the trainable objects
        # in the network, and initializes it to `False`
        nengo_dl.configure_settings(trainable=False)

        a = nengo.Node([0])
        b = nengo.Ensemble(10, 1)
        c = nengo.Node(size_in=1)
        nengo.Connection(a, b)

        # make this specific connection trainable
        conn = nengo.Connection(b, c)
        net.config[conn].trainable = True
Or if we wanted to disable training for some subnetwork:
    with nengo.Network() as net:
        nengo_dl.configure_settings(trainable=None)
        ...

        with nengo.Network() as subnet:
            net.config[subnet].trainable = False
            ...
Note that config[nengo.Ensemble].trainable controls both encoders and biases, as both are properties of an Ensemble. However, it is possible to separately control the biases via config[nengo.ensemble.Neurons].trainable or config[my_ensemble.neurons].trainable.
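For example, a sketch combining these two forms (the particular trainability choices here are illustrative, not required):

    with nengo.Network() as net:
        nengo_dl.configure_settings(trainable=None)

        ens_a = nengo.Ensemble(10, 1)
        ens_b = nengo.Ensemble(10, 1)

        # freeze the biases of every ensemble in the network, while leaving
        # the encoders (and everything else) at their default trainability
        net.config[nengo.ensemble.Neurons].trainable = False

        # or freeze only the biases of one particular ensemble
        net.config[ens_b.neurons].trainable = False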
There are two important caveats to keep in mind when configuring trainable, which differ from the standard config behaviour:

1. trainable applies to all objects in a network, regardless of whether they were created before or after trainable is set. For example,

       with nengo.Network() as net:
           ...
           net.config[nengo.Ensemble].trainable = False
           a = nengo.Ensemble(10, 1)
           ...

   is the same as

       with nengo.Network() as net:
           ...
           a = nengo.Ensemble(10, 1)
           net.config[nengo.Ensemble].trainable = False
           ...

2. trainable can only be set on the config of the top-level network. For example,

       with nengo.Network() as net:
           nengo_dl.configure_settings(trainable=None)

           with nengo.Network() as subnet:
               my_ens = nengo.Ensemble(...)

               # incorrect
               subnet.config[my_ens].trainable = False

               # correct
               net.config[my_ens].trainable = False
planner/sorter/simplifications¶
These options can be used to change the algorithms used for different aspects of the graph optimization stage. For example, we could change the planning algorithm to graph_optimizer.transitive_planner() via

    from nengo_dl.graph_optimizer import transitive_planner

    with nengo.Network() as net:
        nengo_dl.configure_settings(planner=transitive_planner)
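The sorter and simplifications options are set in the same way. As a sketch, assuming the function names noop_order_signals, remove_constant_copies, and remove_unnecessary_signals are available in nengo_dl.graph_optimizer (check that module for the algorithms provided by your version), this could look like:

    from nengo_dl import graph_optimizer

    with nengo.Network() as net:
        nengo_dl.configure_settings(
            # skip the signal sorting step entirely
            sorter=graph_optimizer.noop_order_signals,
            # apply only a subset of the graph simplifications
            simplifications=[
                graph_optimizer.remove_constant_copies,
                graph_optimizer.remove_unnecessary_signals,
            ],
        )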
session_config¶
TensorFlow has its own configuration options which can control various aspects of the TensorFlow Session. session_config can be used to set those options on the underlying NengoDL simulator Session. These are specified as a dictionary mapping config names to values. For example, if in TensorFlow we wanted to do

    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True
    sess = tf.Session(..., config=config)

the equivalent in NengoDL would be

    nengo_dl.configure_settings(
        session_config={"gpu_options.allow_growth": True})
inference_only¶
By default, NengoDL models are built to support both training and inference. However, sometimes we may know that we’ll only be using a simulation for inference (for example, if we want to take advantage of the batching/GPU acceleration of NengoDL, but don’t need the sim.train functionality). In that case we can improve the simulation speed of the model by omitting some of the aspects related to training. Setting nengo_dl.configure_settings(inference_only=True) will cause the network to be built in inference-only mode.
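For example, a minimal sketch of running a model purely for inference (in this mode sim.train would presumably not be available, since the training machinery is omitted):

    import numpy as np
    import nengo
    import nengo_dl

    with nengo.Network() as net:
        # build the model without the extra machinery needed for training
        nengo_dl.configure_settings(inference_only=True)

        inp = nengo.Node(np.sin)
        ens = nengo.Ensemble(50, 1)
        nengo.Connection(inp, ens)
        probe = nengo.Probe(ens, synapse=0.01)

    with nengo_dl.Simulator(net) as sim:
        sim.run(1.0)  # inference works as usual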
lif_smoothing¶
During training, NengoDL automatically replaces the non-differentiable spiking LIF neuron model with the differentiable LIFRate approximation. However, although LIFRate is generally differentiable, it has a sharp discontinuity at the firing threshold. In some cases this can lead to difficulties during the training process, and performance can be improved by smoothing the LIFRate response around the firing threshold. This is known as the SoftLIFRate neuron model.

SoftLIFRate has a parameter sigma that controls the degree of smoothing (SoftLIFRate approaches LIFRate as sigma goes to zero). Setting nengo_dl.configure_settings(lif_smoothing=x) will cause the LIF gradients to be approximated by SoftLIFRate instead of LIFRate, with sigma=x.
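For example (the value 0.1 below is just an illustrative choice of sigma):

    with nengo.Network() as net:
        # compute gradients for LIF neurons using SoftLIFRate with sigma=0.1
        nengo_dl.configure_settings(lif_smoothing=0.1)

        ens = nengo.Ensemble(50, 1, neuron_type=nengo.LIF())
        ...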
dtype¶
This specifies the floating point precision to be used for the simulator’s internal computations. It can be either tf.float32 or tf.float64, for 32- or 64-bit precision, respectively. 32-bit precision is the default, as it is faster, uses less memory, and in most cases will not make a difference in the results of the simulation. However, if very precise outputs are required then this can be changed to tf.float64.
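For example:

    import tensorflow as tf

    with nengo.Network() as net:
        # run all simulator computations in 64-bit precision
        nengo_dl.configure_settings(dtype=tf.float64)
        ...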
API¶
nengo_dl.config.configure_settings(**kwargs)[source]¶

    Pass settings to nengo_dl by setting them as parameters on the top-level Network config.

    The settings are passed as keyword arguments to configure_settings; e.g., to set trainable use configure_settings(trainable=True).

    Parameters:

    - trainable : bool or None
      Adds a parameter to Nengo Ensembles/Connections/Networks that controls whether or not they will be optimized by Simulator.train(). Passing None will use the default nengo_dl trainable settings, or True/False will override the default for all objects. In either case trainability can be further configured on a per-object basis (e.g. net.config[my_ensemble].trainable = True). See the documentation for more details.
    - planner : graph planning algorithm
      Pass one of the graph planners to change the default planner.
    - sorter : signal sorting algorithm
      Pass one of the sort algorithms to change the default sorter.
    - simplifications : list of graph simplification functions
      Pass a list of graph simplification functions to change the default simplifications applied.
    - session_config : dict
      Config options passed to tf.Session initialization (e.g., to change the GPU memory allocation method pass {"gpu_options.allow_growth": True}).
    - inference_only : bool
      Set to True if the network will only be run in inference mode (i.e., no calls to Simulator.train()). This may result in a small increase in the inference speed.
    - lif_smoothing : float
      If specified, use the smoothed SoftLIFRate neuron model, with the given smoothing parameter (sigma), to compute the gradient for LIF neurons (as opposed to using LIFRate).
    - dtype : tf.DType
      Set the floating point precision for simulation values.
nengo_dl.config.get_setting(model, setting, default=None)[source]¶

    Returns config settings (created by configure_settings()).

    Parameters:

    - model : Model
      The built model in which to look up the setting.
    - setting : str
      Name of the configuration option to retrieve.
    - default
      Value to return if the setting has not been configured.

    Returns:

    - Value of ``setting`` if it has been specified, else ``default``.
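As a usage sketch, assuming the built model is available on the Simulator as sim.model:

    import nengo
    import nengo_dl
    from nengo_dl.config import get_setting

    with nengo.Network() as net:
        nengo_dl.configure_settings(inference_only=True)
        nengo.Ensemble(10, 1)

    with nengo_dl.Simulator(net) as sim:
        # read a configured setting back from the built model, falling back
        # to a default if it was never configured
        print(get_setting(sim.model, "inference_only", default=False))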