References
==========

This page lists all references cited throughout the CANNs documentation.
.. note::

   To cite these references in your documentation or notebooks, use the :cite: role.
   For example, :cite:`wu2008dynamics` renders as [Wu08].
Complete Bibliography
---------------------
John O'Keefe and Jonathan Dostrovsky. The hippocampus as a spatial map: preliminary evidence from unit activity in the freely-moving rat. Brain research, 1971. doi:10.1016/0006-8993(71)90358-1.
Torkel Hafting, Marianne Fyhn, Sturla Molden, May-Britt Moser, and Edvard I Moser. Microstructure of a spatial map in the entorhinal cortex. Nature, 436(7052):801–806, 2005. doi:10.1038/nature03721.
Jeffrey S Taube, Robert U Muller, and James B Ranck. Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis. Journal of Neuroscience, 10(2):420–435, 1990. doi:10.1523/JNEUROSCI.10-02-00420.1990.
Bruce L McNaughton, Francesco P Battaglia, Ole Jensen, Edvard I Moser, and May-Britt Moser. Path integration and the neural basis of the 'cognitive map'. Nature Reviews Neuroscience, 7(8):663–678, 2006. doi:10.1038/nrn1932.
Shun-ichi Amari. Dynamics of pattern formation in lateral-inhibition type neural fields. Biological cybernetics, 27(2):77–87, 1977. doi:10.1007/BF00337259.
Si Wu, Kosuke Hamaguchi, and Shun-ichi Amari. Dynamics and computation of continuous attractors. Neural computation, 20(4):994–1025, 2008. doi:10.1162/neco.2008.10-06-378.
CC Alan Fung, KY Michael Wong, and Si Wu. A moving bump in a continuous manifold: a comprehensive study of the tracking dynamics of continuous attractor neural networks. Neural Computation, 22(3):752–792, 2010. doi:10.1162/neco.2009.07-08-824.
Si Wu, KY Michael Wong, CC Alan Fung, Yuanyuan Mi, and Wenhao Zhang. Continuous attractor neural networks: candidate of a canonical model for neural information representation. F1000Research, 5(F1000 Faculty Rev):156, 2016. doi:10.12688/f1000research.7387.1.
Yuanyuan Mi, CC Fung, KY Wong, and Si Wu. Spike frequency adaptation implements anticipative tracking in continuous attractor neural networks. Advances in neural information processing systems, 2014.
Yujun Li, Tianhao Chu, and Si Wu. Dynamics of continuous attractor neural networks with spike frequency adaptation. Neural Computation, 37(6):1057–1101, 2025. doi:10.1162/neco_a_01757.
Zilong Ji, Tianhao Chu, Si Wu, and Neil Burgess. A systems model of alternating theta sweeps via firing rate adaptation. Current Biology, 35(4):709–722, 2025. doi:10.1016/j.cub.2024.08.059.
Yingxue Wang, Sandro Romani, Brian Lustig, Anthony Leonardo, and Eva Pastalkova. Theta sequences are essential for internally generated hippocampal firing fields. Nature neuroscience, 18(2):282–288, 2015. doi:10.1038/nn.3904.
Zilong Ji, Eleonora Lomi, Kate Jeffery, Anna S Mitchell, and Neil Burgess. Phase precession relative to turning angle in theta-modulated head direction cells. Hippocampus, 35(2):e70008, 2025. doi:10.1002/hipo.70008.
Tianhao Chu, Zilong Ji, Junfeng Zuo, Yuanyuan Mi, Wen-hao Zhang, Tiejun Huang, Daniel Bush, Neil Burgess, and Si Wu. Firing rate adaptation affords place cell theta sweeps, phase precession, and procession. Elife, 12:RP87055, 2024. doi:10.7554/eLife.87055.4.
Gunnar Carlsson. Topology and data. Bulletin of the American Mathematical Society, 46(2):255–308, 2009. doi:10.1090/S0273-0979-09-01249-X.
Herbert Edelsbrunner and John Harer. Computational topology: an introduction. American Mathematical Soc., 2010.
James Bradbury, Roy Frostig, Peter Hawkins, Matthew James Johnson, Chris Leary, Dougal Maclaurin, George Necula, Adam Paszke, Jake VanderPlas, Skye Wanderman-Milne, and Qiao Zhang. JAX: composable transformations of Python+NumPy programs. 2018. URL: http://github.com/jax-ml/jax.
Chaoming Wang, Tianqiu Zhang, Xiaoyu Chen, Sichao He, Shangyang Li, and Si Wu. BrainPy, a flexible, integrative, efficient, and extensible framework for general-purpose brain dynamics programming. eLife, 12:e86365, December 2023. URL: https://doi.org/10.7554/eLife.86365, doi:10.7554/eLife.86365.
Tianhao Chu, Yuling Wu, Wentao Qiu, Zihao Jiang, Neil Burgess, Bo Hong, and Si Wu. Localized space coding and phase coding complement each other to achieve robust and efficient spatial representation. bioRxiv, pages 2025–09, 2025.
Yoram Burak and Ila R Fiete. Accurate path integration in continuous attractor network models of grid cells. PLoS computational biology, 5(2):e1000291, 2009.
John J Hopfield. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the national academy of sciences, 79(8):2554–2558, 1982. doi:10.1073/pnas.79.8.2554.
Donald Olding Hebb. The organization of behavior: A neuropsychological theory. Psychology Press, 2005.
S-I Amari. Neural theory of association and concept-formation. Biological cybernetics, 26(3):175–185, 1977. doi:10.1007/BF00365229.
Guo-qiang Bi and Mu-ming Poo. Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type. Journal of neuroscience, 18(24):10464–10472, 1998. doi:10.1523/JNEUROSCI.18-24-10464.1998.
Erkki Oja. Simplified neuron model as a principal component analyzer. Journal of mathematical biology, 15(3):267–273, 1982. doi:10.1007/BF00275687.
Terence D Sanger. Optimal unsupervised learning in a single-layer linear feedforward neural network. Neural networks, 2(6):459–473, 1989. doi:10.1016/0893-6080(89)90044-0.
Ariane S Etienne and Kathryn J Jeffery. Path integration in mammals. Hippocampus, 14(2):180–192, 2004. doi:10.1002/hipo.10173.
Alexei Samsonovich and Bruce L McNaughton. Path integration and cognitive mapping in a continuous attractor neural network model. Journal of Neuroscience, 17(15):5900–5920, 1997. doi:10.1523/JNEUROSCI.17-15-05900.1997.
Melvin Vaupel, Erik Hermansen, and Benjamin A. Dunn. A topological perspective on the dual nature of the neural state space and the correlation structure. bioRxiv, 2023. URL: https://doi.org/10.1101/2023.10.17.562775, doi:10.1101/2023.10.17.562775.
Richard J. Gardner, Erik Hermansen, Marius Pachitariu, Yoram Burak, Nils A. Baas, Benjamin A. Dunn, May-Britt Moser, and Edvard I. Moser. Toroidal topology of population activity in grid cells. Nature, 602:123–128, 2022. URL: https://doi.org/10.1038/s41586-021-04268-7, doi:10.1038/s41586-021-04268-7.
David Sussillo and Omri Barak. Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks. Neural computation, 25(3):626–649, 2013. doi:10.1162/NECO_a_00409.
Matthew D Golub and David Sussillo. FixedPointFinder: a TensorFlow toolbox for identifying and characterizing fixed points in recurrent neural networks. Journal of Open Source Software, 3(31):1003, 2018. doi:10.21105/joss.01003.
Elie L Bienenstock, Leon N Cooper, and Paul W Munro. Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex. Journal of Neuroscience, 2(1):32–48, 1982. doi:10.1523/JNEUROSCI.02-01-00032.1982.
John O'Keefe and Michael L Recce. Phase relationship between hippocampal place units and the EEG theta rhythm. Hippocampus, 3(3):317–330, 1993. doi:10.1002/hipo.450030307.
Sung Soo Kim, Hervé Rouault, Shaul Druckmann, and Vivek Jayaraman. Ring attractor dynamics in the Drosophila central brain. Science, 356(6340):849–853, 2017. URL: https://doi.org/10.1126/science.aal4835, doi:10.1126/science.aal4835.
Sichao He. CANNs: continuous attractor neural networks toolkit. 2025. URL: https://github.com/Routhleck/canns, doi:10.5281/zenodo.17412545.
Marc-Oliver Gewaltig and Markus Diesmann. NEST (Neural Simulation Tool). Scholarpedia, 2(4):1430, 2007.
Marcel Stimberg, Romain Brette, and Dan FM Goodman. Brian 2, an intuitive and efficient neural simulator. eLife, 8:e47314, 2019.
Michael L Hines and Nicholas T Carnevale. The NEURON simulation environment. Neural Computation, 9(6):1179–1209, 1997.
Thomas Wolf, Lysandre Debut, Victor Sanh, Julien Chaumond, Clement Delangue, Anthony Moi, Pierric Cistac, Tim Rault, Rémi Louf, Morgan Funtowicz, Joe Davison, Sam Shleifer, Patrick von Platen, Clara Ma, Yacine Jernite, Julien Plu, Canwen Xu, Teven Le Scao, Sylvain Gugger, Mariama Drame, Quentin Lhoest, and Alexander M. Rush. Transformers: state-of-the-art natural language processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, 38–45. Online, October 2020. Association for Computational Linguistics. URL: https://www.aclweb.org/anthology/2020.emnlp-demos.6.
Jason Ansel, Edward Yang, Horace He, Natalia Gimelshein, Animesh Jain, Michael Voznesensky, Bin Bao, Peter Bell, David Berard, Evgeni Burovski, Geeta Chauhan, Anjali Chourdia, Will Constable, Alban Desmaison, Zachary DeVito, Elias Ellison, Will Feng, Jiong Gong, Michael Gschwind, Brian Hirsh, Sherlock Huang, Kshiteej Kalambarkar, Laurent Kirsch, Michael Lazos, Mario Lezcano, Yanbo Liang, Jason Liang, Yinghai Lu, CK Luk, Bert Maher, Yunjie Pan, Christian Puhrsch, Matthias Reso, Mark Saroufim, Marcos Yukio Siraichi, Helen Suk, Michael Suo, Phil Tillet, Eikan Wang, Xiaodong Wang, William Wen, Shunting Zhang, Xu Zhao, Keren Zhou, Richard Zou, Ajit Mathews, Gregory Chanan, Peng Wu, and Soumith Chintala. PyTorch 2: Faster Machine Learning Through Dynamic Python Bytecode Transformation and Graph Compilation. In 29th ACM International Conference on Architectural Support for Programming Languages and Operating Systems, Volume 2 (ASPLOS '24). ACM, April 2024. URL: https://docs.pytorch.org/assets/pytorch2-2.pdf, doi:10.1145/3620665.3640366.
Martín Abadi, Ashish Agarwal, Paul Barham, Eugene Brevdo, Zhifeng Chen, Craig Citro, Greg S. Corrado, Andy Davis, Jeffrey Dean, Matthieu Devin, Sanjay Ghemawat, Ian Goodfellow, Andrew Harp, Geoffrey Irving, Michael Isard, Rafal Jozefowicz, Yangqing Jia, Lukasz Kaiser, Manjunath Kudlur, Josh Levenberg, Dan Mané, Mike Schuster, Rajat Monga, Sherry Moore, Derek Murray, Chris Olah, Jonathon Shlens, Benoit Steiner, Ilya Sutskever, Kunal Talwar, Paul Tucker, Vincent Vanhoucke, Vijay Vasudevan, Fernanda Viégas, Oriol Vinyals, Pete Warden, Martin Wattenberg, Martin Wicke, Yuan Yu, and Xiaoqiang Zheng. TensorFlow, Large-scale machine learning on heterogeneous systems. November 2015. doi:10.5281/zenodo.4724125.
TorchVision maintainers and contributors. TorchVision: PyTorch's computer vision library. https://github.com/pytorch/vision, 2016.
How to Cite References
----------------------

In RST Files
~~~~~~~~~~~~
Use the :cite: role in your text::

    The dynamics of continuous attractors were analyzed by :cite:`wu2008dynamics`.
    Foundational work includes :cite:`amari1977dynamics` and :cite:`wu2016continuous`.
In Jupyter Notebooks
~~~~~~~~~~~~~~~~~~~~
**Important:** In Jupyter notebooks, you must use raw cells in reStructuredText format, not Markdown cells.

1. Create a raw cell (Cell → Cell Type → Raw).

2. Set the cell metadata to indicate RST format::

       {"raw_mimetype": "text/restructuredtext"}

3. Write RST content with citations::

       This is a paragraph with citations :cite:p:`amari1977dynamics,wu2008dynamics`.

4. Add a bibliography directive at the end of the notebook (in another raw RST cell)::

       References
       ----------

       .. bibliography::
          :cited:
          :style: alpha
**Citation styles:**

- :cite:p:`key` renders a parenthetical citation, (Author, Year); the entire citation is clickable.
- :cite:t:`key` renders a textual citation, Author [Year]; only the year is clickable.
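For example, the two roles can be combined in one sentence (a sketch using citation keys that appear elsewhere on this page):

```rst
Neural field dynamics were introduced by :cite:t:`amari1977dynamics` and
analyzed in the continuous attractor setting :cite:p:`wu2008dynamics`.
```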
Example: see ``docs/en/0_why_canns.ipynb`` for a complete working example.
Adding New References
---------------------
To add new references to the bibliography:

1. Open ``docs/refs/references.bib``.
2. Add your BibTeX entry following the existing format.
3. Use a consistent citation key format: ``authorYEARkeyword`` (e.g., ``wu2008dynamics``).
4. Cite the reference using :cite:`citationkey`.
5. The reference will then automatically appear in this bibliography.
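As a sketch of the expected entry shape, here is what the ``wu2008dynamics`` key might look like in BibTeX, with field values taken from the bibliography above (the exact fields used in ``references.bib`` may differ):

```bibtex
@article{wu2008dynamics,
  author  = {Wu, Si and Hamaguchi, Kosuke and Amari, Shun-ichi},
  title   = {Dynamics and computation of continuous attractors},
  journal = {Neural Computation},
  volume  = {20},
  number  = {4},
  pages   = {994--1025},
  year    = {2008},
  doi     = {10.1162/neco.2008.10-06-378}
}
```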
For more information, see the sphinxcontrib-bibtex documentation.