Tutorial 1: Building and Using CANN Models¶
Note
Reading Time: 25-30 minutes
Difficulty: Beginner
Prerequisites: Python basics, NumPy/JAX array operations
This tutorial will help you understand how models are constructed in the CANNs library and how to use the built-in Wu-Amari-Wong (WAW) continuous attractor neural network (CANN) models [5, 6, 7, 8].
1. Introduction to BrainPy Framework¶
All models in the CANNs library are built on the BrainPy framework [18]. BrainPy is a core framework for dynamical systems in the Brain Simulation Ecosystem, built on JAX with JIT compilation and automatic differentiation support.
1.1 Core Concepts¶
Before we begin, you need to understand these key concepts:
Dynamics Abstraction¶
All CANN models inherit from bp.DynamicalSystem—a base class for defining dynamical systems. It provides:
State management mechanisms
Time step management
JIT compilation support
[1]:
import brainpy as bp
import brainpy.math as bm

class MyModel(bp.DynamicalSystem):
    def update(self, inp):
        # Define single-step dynamics update
        pass
State Containers¶
BrainPy provides state containers for managing different types of variables:
| Container Type | Purpose | Example |
|---|---|---|
| `bm.Variable` | All state variables | `self.u`, `self.r` |
| `bm.Variable` | External stimulus | `self.inp` |
[2]:
def __init__(self):
    super().__init__()
    # State variable: neuron membrane potential
    self.u = bm.Variable(bm.zeros(self.num))
    # State variable: neuron firing rate
    self.r = bm.Variable(bm.zeros(self.num))
    # External input state
    self.inp = bm.Variable(bm.zeros(self.num))
Time Step Management¶
BrainPy manages simulation time steps through bm.set_dt():
[3]:
import brainpy.math as bm
# Set simulation time step (unit: milliseconds)
bm.set_dt(0.1)
# Get current time step in the model
dt = bm.get_dt()
Important: You must set the time step dt before running any simulation—otherwise the model will raise an error.
Further Learning¶
To learn more about the BrainPy framework [18], see the official BrainPy documentation.
2. CANN1D Implementation Analysis¶
Let’s use CANN1D as an example to understand how a complete CANN model is implemented.
2.1 Model Inheritance Structure¶
bp.DynamicalSystem
└── BasicModel
└── BaseCANN
└── BaseCANN1D
└── CANN1D
2.2 Initialization Method __init__¶
The CANN1D initialization method defines all model parameters:
[ ]:
class CANN1D(BaseCANN1D):
    def __init__(
        self,
        num: int,                   # Number of neurons
        tau: float = 1.0,           # Time constant
        k: float = 8.1,             # Global inhibition strength
        a: float = 0.5,             # Connection width
        A: float = 10,              # External input amplitude
        J0: float = 4.0,            # Synaptic connection strength
        z_min: float = -bm.pi,      # Feature space minimum
        z_max: float = bm.pi,       # Feature space maximum
        **kwargs,
    ):
        ...
These parameters control the network’s dynamical behavior. We will explore each parameter’s effect in detail in Tutorial 4.
2.3 Connection Matrix Generation make_conn¶
The make_conn method generates the connectivity matrix between neurons. CANN uses a Gaussian connection kernel—neurons with similar feature preferences have stronger excitatory connections:
[ ]:
def make_conn(self):
    # Calculate distances between all neuron pairs
    x_left = bm.reshape(self.x, (-1, 1))
    x_right = bm.repeat(self.x.reshape((1, -1)), len(self.x), axis=0)
    d = self.dist(x_left - x_right)
    # Compute connection strength using Gaussian function
    return (
        self.J0
        * bm.exp(-0.5 * bm.square(d / self.a))
        / (bm.sqrt(2 * bm.pi) * self.a)
    )
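The same Gaussian kernel can be reproduced outside the model with plain NumPy. The snippet below is a standalone sketch, not the library's code: the `ring_dist` helper and the parameter values are assumptions that mirror what `self.dist` does for a periodic feature space.

```python
import numpy as np

# Hypothetical standalone version of make_conn for a periodic feature space.
num, J0, a = 64, 4.0, 0.5
z_min, z_max = -np.pi, np.pi
x = np.linspace(z_min, z_max, num, endpoint=False)  # preferred stimuli

def ring_dist(d):
    # Wrap differences onto [-pi, pi) and take magnitudes,
    # mirroring what self.dist does on the ring
    rng = z_max - z_min
    return np.abs((d + rng / 2) % rng - rng / 2)

# Pairwise distances between preferred stimuli
d = ring_dist(x[:, None] - x[None, :])

# Gaussian connection kernel, as in make_conn
conn = J0 * np.exp(-0.5 * (d / a) ** 2) / (np.sqrt(2 * np.pi) * a)

print(conn.shape)                 # (64, 64)
print(np.allclose(conn, conn.T))  # True: connectivity is symmetric
```

Note that each row peaks on the diagonal: a neuron excites itself and its nearest neighbors in feature space most strongly, which is exactly the "local excitation" a CANN needs.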
2.4 Stimulus Generation get_stimulus_by_pos¶
get_stimulus_by_pos generates external stimulus (a Gaussian-shaped bump) based on a given position in feature space:
[ ]:
def get_stimulus_by_pos(self, pos):
    return self.A * bm.exp(
        -0.25 * bm.square(self.dist(self.x - pos) / self.a)
    )
This method is called by the task module to generate input data.
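As a quick sanity check, the same Gaussian bump can be evaluated with NumPy alone. This is a sketch under the assumption that `self.dist` wraps distances onto the periodic feature space; `ring_dist` below is a hypothetical stand-in for it.

```python
import numpy as np

A, a = 10.0, 0.5
x = np.linspace(-np.pi, np.pi, 256, endpoint=False)  # preferred stimuli

def ring_dist(d):
    # Wrap differences onto [-pi, pi), then take magnitudes
    return np.abs((d + np.pi) % (2 * np.pi) - np.pi)

def get_stimulus_by_pos(pos):
    # Gaussian bump centred at pos, as in the method above
    return A * np.exp(-0.25 * (ring_dist(x - pos) / a) ** 2)

stim = get_stimulus_by_pos(0.5)
print(stim.shape)   # (256,)
print(stim.max())   # just below A at the grid point nearest pos
```

The peak sits at the neuron whose preferred stimulus is closest to `pos`, and the bump's amplitude is set by `A` while its width is set by `a`.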
2.5 Dynamics Update update¶
The update method defines the single-step dynamics update of the network:
[ ]:
def update(self, inp):
    self.inp.value = inp
    # Compute firing rate (divisive normalization)
    r1 = bm.square(self.u.value)
    r2 = 1.0 + self.k * bm.sum(r1)
    self.r.value = r1 / r2
    # Compute recurrent input
    Irec = bm.dot(self.conn_mat, self.r.value)
    # Update membrane potential using Euler method
    self.u.value += (
        (-self.u.value + Irec + self.inp.value)
        / self.tau * bm.get_dt()
    )
3. How to Use Built-in CANN Models¶
Now let’s learn how to actually use the built-in CANN models.
3.1 Basic Usage Workflow¶
[4]:
import brainpy.math as bm
from canns.models.basic import CANN1D
# Step 1: Set time step
bm.set_dt(0.1)
# Step 2: Create model instance
model = CANN1D(
    num=256,    # 256 neurons
    tau=1.0,    # Time constant
    k=8.1,      # Global inhibition
    a=0.5,      # Connection width
    A=10,       # Input amplitude
    J0=4.0,     # Connection strength
)
# Step 3: View model information
print(f"Number of neurons: {model.shape}")
print(f"Feature space range: [{model.z_min}, {model.z_max}]")
print(f"Connection matrix shape: {model.conn_mat.shape}")
Number of neurons: (256,)
Feature space range: [-3.141592653589793, 3.141592653589793]
Connection matrix shape: (256, 256)
3.2 Running a Single Step Update¶
[5]:
# Generate external stimulus at pos=0
pos = 0.0
stimulus = model.get_stimulus_by_pos(pos)
# Run two update steps
model(stimulus) # or you can explicitly call model.update(stimulus)
model(stimulus)
# View current state
print(f"Firing rate shape: {model.r.value.shape}")
print(f"Max firing rate: {bm.max(model.r.value):.4f}")
print(f"Max membrane potential: {bm.max(model.u.value):.4f}")
Firing rate shape: (256,)
Max firing rate: 0.0024
Max membrane potential: 1.9275
3.3 Complete Example¶
Here’s a complete example of creating and testing a CANN1D model:
[6]:
import brainpy.math as bm
from canns.models.basic import CANN1D
# Setup environment
bm.set_dt(0.1)
# Create model (auto-initializes)
model = CANN1D(num=256, tau=1.0, k=8.1, a=0.5, A=10, J0=4.0)
# Print basic model information
print("=" * 50)
print("CANN1D Model Information")
print("=" * 50)
print(f"Number of neurons: {model.shape}")
print(f"Time constant tau: {model.tau}")
print(f"Global inhibition k: {model.k}")
print(f"Connection width a: {model.a}")
print(f"Input amplitude A: {model.A}")
print(f"Connection strength J0: {model.J0}")
print(f"Feature space: [{model.z_min:.2f}, {model.z_max:.2f}]")
print(f"Neural density rho: {model.rho:.2f}")
# Test stimulus generation
pos = 0.5
stimulus = model.get_stimulus_by_pos(pos)
print(f"\nStimulus position: {pos}")
print(f"Stimulus shape: {stimulus.shape}")
print(f"Max stimulus value: {bm.max(stimulus):.4f}")
# Run several update steps
print("\nRunning 100 update steps...")
for _ in range(100):
    model(stimulus)
print(f"Max firing rate: {bm.max(model.r.value):.6f}")
print(f"Max membrane potential: {bm.max(model.u.value):.6f}")
==================================================
CANN1D Model Information
==================================================
Number of neurons: (256,)
Time constant tau: 1.0
Global inhibition k: 8.1
Connection width a: 0.5
Input amplitude A: 10
Connection strength J0: 4.0
Feature space: [-3.14, 3.14]
Neural density rho: 40.74
Stimulus position: 0.5
Stimulus shape: (256,)
Max stimulus value: 9.9997
Running 100 update steps...
Max firing rate: 0.002427
Max membrane potential: 10.278063
4. Overview of Built-in Models¶
The CANNs library provides three categories of built-in models:
Basic Models¶
Standard CANN implementations and variants:
CANN1D: 1D continuous attractor neural network
CANN1D_SFA: CANN1D with spike frequency adaptation (SFA)
CANN2D: 2D continuous attractor neural network
CANN2D_SFA: CANN2D with SFA
Hierarchical path integration networks (grid cells, place cells, band cells, etc.)
Theta sweep models
Brain-Inspired Models¶
Learning models based on neuroscience principles:
Hopfield networks
…
Hybrid Models¶
Combinations of CANN with artificial neural networks (under development).
Detailed Information: See the Core Concepts documentation for a complete list of models and use cases.
Congratulations on completing the first CANN modeling tutorial! You now understand:
The basic CANN architecture and its neural dynamics
How to create CANN1D and CANN2D models
The role of key parameters (num, k, tau, a, A, J0)
How to run simulations and observe network states
What You’ve Learned¶
- Model Creation
You can instantiate CANN models with custom parameters and understand what each parameter controls.
- Network Dynamics
You understand the continuous attractor dynamics—how local excitation and global inhibition create stable activity bumps.
- State Variables
You know how to access internal states (u for synaptic input, r for firing rates).
Continue Learning¶
Now you’re ready for the next tutorial:
Tutorial 2: Task Generation and Simulation—Learn how to generate navigation tasks and run complete simulations with external inputs
From there, continue with:
Tutorial 3: Analysis and Visualization—Learn visualization tools for CANN dynamics
Tutorial 4: Parameter Effects—Systematic exploration of parameter effects
Key Takeaways¶
CANNs are dynamical systems—They evolve over time according to differential equations
Bumps are attractors—The localized activity patterns are stable fixed points of the dynamics
Parameters matter—Different parameter choices create different dynamical behaviors
BrainPy handles complexity—The framework manages state variables and time stepping automatically