Basic usage
The standard Legendre Memory Unit (LMU) layer implementation in KerasLMU is defined in the keras_lmu.LMU class. The following code creates a new LMU layer:
import keras
import keras_lmu

lmu_layer = keras_lmu.LMU(
    memory_d=1,  # dimensionality of the signal stored in the memory
    order=256,  # dimensionality of the LMU basis
    theta=784,  # length of the sliding window, in timesteps
    hidden_cell=keras.layers.SimpleRNNCell(units=10),  # hidden component
)
Note that the values used above for memory_d, order, theta, and units are arbitrary example values; actual parameter settings will depend on your specific application.
memory_d represents the dimensionality of the signal represented in the LMU memory, order represents the dimensionality of the LMU basis, theta represents the length (in timesteps) of the sliding window, and units represents the dimensionality of the hidden component.
To learn more about these parameters, check out
the LMU class API reference.
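To see how these parameters show up in the layer's output shape, the minimal sketch below (the random input data and its shape are illustrative assumptions, not part of the KerasLMU documentation) applies the layer defined above to a batch of sequences. The input features are encoded into memory_d memory signals internally, and by default the layer returns only the final hidden state, whose dimensionality is set by units:

import numpy as np

# Hypothetical input: a batch of 32 sequences, each 784 timesteps long
# (matching theta above) with 10 features per timestep.
x = np.random.uniform(size=(32, 784, 10)).astype("float32")

y = lmu_layer(x)
print(y.shape)  # (32, 10): one 10-dimensional hidden state per sequence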
Creating KerasLMU layers
The LMU class functions as a standard Keras layer and is meant to be used within a Keras model. The code below illustrates how to do this using a Keras model with a 10-dimensional input and a 20-dimensional output.
inputs = keras.Input((None, 10))  # variable-length sequences of 10-dimensional vectors
lmus = lmu_layer(inputs)  # LMU output: the final 10-dimensional hidden state
outputs = keras.layers.Dense(20)(lmus)  # project to the 20-dimensional output

model = keras.Model(inputs=inputs, outputs=outputs)
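From here the model can be compiled and trained like any other Keras model. The sketch below is only a hedged illustration: the optimizer, loss, and random training data are arbitrary assumptions, not recommendations from the KerasLMU documentation.

import numpy as np

# Hypothetical random data purely for illustration: 32 sequences of
# length 784, with 10 input features and a 20-dimensional target each.
x_train = np.random.uniform(size=(32, 784, 10)).astype("float32")
y_train = np.random.uniform(size=(32, 20)).astype("float32")

model.compile(optimizer="adam", loss="mse")
model.fit(x_train, y_train, epochs=1)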
Other parameters
The LMU class has several other configuration options; see the API reference for all the details.
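For example, the connectivity between the LMU's components can be adjusted with boolean flags. The sketch below is a hedged example (these parameter names appear in the API reference, but their defaults and availability may differ between versions):

lmu_layer = keras_lmu.LMU(
    memory_d=1,
    order=256,
    theta=784,
    hidden_cell=keras.layers.SimpleRNNCell(units=10),
    hidden_to_memory=True,  # feed the hidden state back into the memory input
    memory_to_memory=True,  # add a trainable memory-to-memory connection
    return_sequences=True,  # return the output at every timestep
)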