References
==========

This page lists all references cited throughout the CANNs documentation.

.. note::

   To cite these references in your documentation or notebooks, use the ``:cite:`` role. For example, :cite:`wu2008dynamics` renders as [Wu08].

Complete Bibliography
---------------------

[1]

John O'Keefe and Jonathan Dostrovsky. The hippocampus as a spatial map: preliminary evidence from unit activity in the freely-moving rat. Brain research, 1971. doi:10.1016/0006-8993(71)90358-1.

[2]

Torkel Hafting, Marianne Fyhn, Sturla Molden, May-Britt Moser, and Edvard I Moser. Microstructure of a spatial map in the entorhinal cortex. Nature, 436(7052):801–806, 2005. doi:10.1038/nature03721.

[3]

Jeffrey S Taube, Robert U Muller, and James B Ranck. Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis. Journal of Neuroscience, 10(2):420–435, 1990. doi:10.1523/JNEUROSCI.10-02-00420.1990.

[4]

Bruce L McNaughton, Francesco P Battaglia, Ole Jensen, Edvard I Moser, and May-Britt Moser. Path integration and the neural basis of the 'cognitive map'. Nature Reviews Neuroscience, 7(8):663–678, 2006. doi:10.1038/nrn1932.

[5]

Shun-ichi Amari. Dynamics of pattern formation in lateral-inhibition type neural fields. Biological cybernetics, 27(2):77–87, 1977. doi:10.1007/BF00337259.

[6]

Si Wu, Kosuke Hamaguchi, and Shun-ichi Amari. Dynamics and computation of continuous attractors. Neural computation, 20(4):994–1025, 2008. doi:10.1162/neco.2008.10-06-378.

[7]

CC Alan Fung, KY Michael Wong, and Si Wu. A moving bump in a continuous manifold: a comprehensive study of the tracking dynamics of continuous attractor neural networks. Neural Computation, 22(3):752–792, 2010. doi:10.1162/neco.2009.07-08-824.

[8]

Si Wu, KY Michael Wong, CC Alan Fung, Yuanyuan Mi, and Wenhao Zhang. Continuous attractor neural networks: candidate of a canonical model for neural information representation. F1000Research, 5(F1000 Faculty Rev), 2016. doi:10.12688/f1000research.7387.1.

[9]

Yuanyuan Mi, CC Fung, KY Wong, and Si Wu. Spike frequency adaptation implements anticipative tracking in continuous attractor neural networks. Advances in neural information processing systems, 2014.

[10]

Yujun Li, Tianhao Chu, and Si Wu. Dynamics of continuous attractor neural networks with spike frequency adaptation. Neural Computation, 37(6):1057–1101, 2025. doi:10.1162/neco_a_01757.

[11]

Zilong Ji, Tianhao Chu, Si Wu, and Neil Burgess. A systems model of alternating theta sweeps via firing rate adaptation. Current Biology, 35(4):709–722, 2025. doi:10.1016/j.cub.2024.08.059.

[12]

Yingxue Wang, Sandro Romani, Brian Lustig, Anthony Leonardo, and Eva Pastalkova. Theta sequences are essential for internally generated hippocampal firing fields. Nature neuroscience, 18(2):282–288, 2015. doi:10.1038/nn.3904.

[13]

Zilong Ji, Eleonora Lomi, Kate Jeffery, Anna S Mitchell, and Neil Burgess. Phase precession relative to turning angle in theta-modulated head direction cells. Hippocampus, 35(2):e70008, 2025. doi:10.1002/hipo.70008.

[14]

Tianhao Chu, Zilong Ji, Junfeng Zuo, Yuanyuan Mi, Wen-hao Zhang, Tiejun Huang, Daniel Bush, Neil Burgess, and Si Wu. Firing rate adaptation affords place cell theta sweeps, phase precession, and procession. eLife, 12:RP87055, 2024. doi:10.7554/eLife.87055.4.

[15]

Gunnar Carlsson. Topology and data. Bulletin of the American Mathematical Society, 46(2):255–308, 2009. doi:10.1090/S0273-0979-09-01249-X.

[16]

Herbert Edelsbrunner and John Harer. Computational topology: an introduction. American Mathematical Soc., 2010.

[17]

James Bradbury, Roy Frostig, Peter Hawkins, Matthew James Johnson, Chris Leary, Dougal Maclaurin, George Necula, Adam Paszke, Jake VanderPlas, Skye Wanderman-Milne, and Qiao Zhang. JAX: composable transformations of Python+NumPy programs. 2018. URL: http://github.com/jax-ml/jax.

[18]

Chaoming Wang, Tianqiu Zhang, Xiaoyu Chen, Sichao He, Shangyang Li, and Si Wu. BrainPy, a flexible, integrative, efficient, and extensible framework for general-purpose brain dynamics programming. eLife, 12:e86365, 2023. doi:10.7554/eLife.86365.

[19]

Tianhao Chu, Yuling Wu, Wentao Qiu, Zihao Jiang, Neil Burgess, Bo Hong, and Si Wu. Localized space coding and phase coding complement each other to achieve robust and efficient spatial representation. bioRxiv, 2025.

[20]

Yoram Burak and Ila R Fiete. Accurate path integration in continuous attractor network models of grid cells. PLoS computational biology, 5(2):e1000291, 2009.

[21]

John J Hopfield. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the national academy of sciences, 79(8):2554–2558, 1982. doi:10.1073/pnas.79.8.2554.

[22]

Donald Olding Hebb. The organization of behavior: A neuropsychological theory. Psychology press, 2005.

[23]

S-I Amari. Neural theory of association and concept-formation. Biological cybernetics, 26(3):175–185, 1977. doi:10.1007/BF00365229.

[24]

Guo-qiang Bi and Mu-ming Poo. Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type. Journal of neuroscience, 18(24):10464–10472, 1998. doi:10.1523/JNEUROSCI.18-24-10464.1998.

[25]

Erkki Oja. Simplified neuron model as a principal component analyzer. Journal of mathematical biology, 15(3):267–273, 1982. doi:10.1007/BF00275687.

[26]

Terence D Sanger. Optimal unsupervised learning in a single-layer linear feedforward neural network. Neural networks, 2(6):459–473, 1989. doi:10.1016/0893-6080(89)90044-0.

[27]

Ariane S Etienne and Kathryn J Jeffery. Path integration in mammals. Hippocampus, 14(2):180–192, 2004. doi:10.1002/hipo.10173.

[28]

Alexei Samsonovich and Bruce L McNaughton. Path integration and cognitive mapping in a continuous attractor neural network model. Journal of Neuroscience, 17(15):5900–5920, 1997. doi:10.1523/JNEUROSCI.17-15-05900.1997.

[29]

David Sussillo and Omri Barak. Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks. Neural computation, 25(3):626–649, 2013. doi:10.1162/NECO_a_00409.

[30]

Matthew D Golub and David Sussillo. FixedPointFinder: a TensorFlow toolbox for identifying and characterizing fixed points in recurrent neural networks. Journal of Open Source Software, 3(31):1003, 2018. doi:10.21105/joss.01003.

[31]

Elie L Bienenstock, Leon N Cooper, and Paul W Munro. Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex. Journal of Neuroscience, 2(1):32–48, 1982. doi:10.1523/JNEUROSCI.02-01-00032.1982.

[32]

John O'Keefe and Michael L Recce. Phase relationship between hippocampal place units and the EEG theta rhythm. Hippocampus, 3(3):317–330, 1993. doi:10.1002/hipo.450030307.

[33]

Sichao He. CANNs: continuous attractor neural networks toolkit. 2025. URL: https://github.com/Routhleck/canns, doi:10.5281/zenodo.17412545.

How to Cite References
----------------------

In RST Files
~~~~~~~~~~~~

Use the ``:cite:`` role in your text::

   The dynamics of continuous attractors were analyzed by :cite:`wu2008dynamics`.
   Foundational work includes :cite:`amari1977dynamics` and :cite:`wu2016continuous`.
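If an RST page should also render the entries it cites, the sphinxcontrib-bibtex ``bibliography`` directive can be placed at the bottom of the file (the options here are illustrative; this page itself is generated the same way):

```rst
References
----------

.. bibliography::
   :cited:
   :style: alpha
```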

In Jupyter Notebooks
~~~~~~~~~~~~~~~~~~~~

**Important:** In Jupyter notebooks, you must use raw cells with reStructuredText content, not markdown cells.

1. Create a raw cell (Cell → Cell Type → Raw).

2. Set the cell metadata to indicate RST format::

      {
        "raw_mimetype": "text/restructuredtext"
      }

3. Write RST content with citations::

      This is a paragraph with citations :cite:p:`amari1977dynamics,wu2008dynamics`.

4. Add a bibliography directive at the end of the notebook (in another raw RST cell)::

      References
      ----------

      .. bibliography::
         :cited:
         :style: alpha
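If you build notebooks programmatically instead of through the menu, the raw cell from the steps above corresponds to a small JSON structure. The sketch below shows the relevant fields of the nbformat v4 cell schema (only the fields shown are assumed):

```python
import json

# A raw cell carrying RST content, following the nbformat v4 cell schema.
# The "raw_mimetype" metadata tells the exporter to parse the source as
# reStructuredText rather than leaving it as plain text.
raw_cell = {
    "cell_type": "raw",
    "metadata": {"raw_mimetype": "text/restructuredtext"},
    "source": "This is a paragraph with citations "
              ":cite:p:`amari1977dynamics,wu2008dynamics`.",
}

print(json.dumps(raw_cell, indent=2))
```

Appending such a dictionary to a notebook's ``cells`` list (and saving with ``json.dump``) yields the same result as creating the raw cell by hand.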

Citation styles:

* :cite:p:`key` renders a parenthetical citation, (Author, Year); the entire citation is clickable.

* :cite:t:`key` renders a textual citation, Author [Year]; only the year is clickable.
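In running text the two styles might be combined like this (keys taken from the bibliography above):

```rst
Attractor dynamics have been studied extensively :cite:p:`wu2008dynamics`.
:cite:t:`amari1977dynamics` analyzed pattern formation in neural fields.
```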

Example: see ``docs/en/0_why_canns.ipynb`` for a complete working example.

Adding New References
---------------------

To add new references to the bibliography:

1. Open ``docs/refs/references.bib``.

2. Add your BibTeX entry, following the existing format.

3. Use a consistent citation key format: ``authorYEARkeyword`` (e.g., ``wu2008dynamics``).

4. Cite the reference using :cite:`citationkey`.

5. The reference will automatically appear in this bibliography.
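For instance, entry [6] above could be encoded roughly as follows (the field layout is illustrative; match whatever conventions the existing ``references.bib`` uses):

```bibtex
@article{wu2008dynamics,
  author  = {Wu, Si and Hamaguchi, Kosuke and Amari, Shun-ichi},
  title   = {Dynamics and computation of continuous attractors},
  journal = {Neural Computation},
  volume  = {20},
  number  = {4},
  pages   = {994--1025},
  year    = {2008},
  doi     = {10.1162/neco.2008.10-06-378}
}
```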

For more information, see the sphinxcontrib-bibtex documentation.