References
==========

This page lists all references cited in the CANNs documentation.

.. note::

   To cite these references in documentation pages or notebooks, use the :cite: role. For example, :cite:`wu2008dynamics` renders as [Wu08].

Full Reference List
-------------------

[1]

John O'Keefe and Jonathan Dostrovsky. The hippocampus as a spatial map: preliminary evidence from unit activity in the freely-moving rat. Brain research, 1971. doi:10.1016/0006-8993(71)90358-1.

[2]

Torkel Hafting, Marianne Fyhn, Sturla Molden, May-Britt Moser, and Edvard I Moser. Microstructure of a spatial map in the entorhinal cortex. Nature, 436(7052):801–806, 2005. doi:10.1038/nature03721.

[3]

Jeffrey S Taube, Robert U Muller, and James B Ranck. Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis. Journal of Neuroscience, 10(2):420–435, 1990. doi:10.1523/JNEUROSCI.10-02-00420.1990.

[4]

Bruce L McNaughton, Francesco P Battaglia, Ole Jensen, Edvard I Moser, and May-Britt Moser. Path integration and the neural basis of the 'cognitive map'. Nature Reviews Neuroscience, 7(8):663–678, 2006. doi:10.1038/nrn1932.

[5]

Shun-ichi Amari. Dynamics of pattern formation in lateral-inhibition type neural fields. Biological cybernetics, 27(2):77–87, 1977. doi:10.1007/BF00337259.

[6]

Si Wu, Kosuke Hamaguchi, and Shun-ichi Amari. Dynamics and computation of continuous attractors. Neural computation, 20(4):994–1025, 2008. doi:10.1162/neco.2008.10-06-378.

[7]

CC Alan Fung, KY Michael Wong, and Si Wu. A moving bump in a continuous manifold: a comprehensive study of the tracking dynamics of continuous attractor neural networks. Neural Computation, 22(3):752–792, 2010. doi:10.1162/neco.2009.07-08-824.

[8]

Si Wu, KY Michael Wong, CC Alan Fung, Yuanyuan Mi, and Wenhao Zhang. Continuous attractor neural networks: candidate of a canonical model for neural information representation. F1000Research, 5:F1000–Faculty, 2016. doi:10.12688/f1000research.7387.1.

[9]

Yuanyuan Mi, CC Fung, KY Wong, and Si Wu. Spike frequency adaptation implements anticipative tracking in continuous attractor neural networks. Advances in neural information processing systems, 2014.

[10]

Yujun Li, Tianhao Chu, and Si Wu. Dynamics of continuous attractor neural networks with spike frequency adaptation. Neural Computation, 37(6):1057–1101, 2025. doi:10.1162/neco_a_01757.

[11]

Zilong Ji, Tianhao Chu, Si Wu, and Neil Burgess. A systems model of alternating theta sweeps via firing rate adaptation. Current Biology, 35(4):709–722, 2025. doi:10.1016/j.cub.2024.08.059.

[12]

Yingxue Wang, Sandro Romani, Brian Lustig, Anthony Leonardo, and Eva Pastalkova. Theta sequences are essential for internally generated hippocampal firing fields. Nature neuroscience, 18(2):282–288, 2015. doi:10.1038/nn.3904.

[13]

Zilong Ji, Eleonora Lomi, Kate Jeffery, Anna S Mitchell, and Neil Burgess. Phase precession relative to turning angle in theta-modulated head direction cells. Hippocampus, 35(2):e70008, 2025. doi:10.1002/hipo.70008.

[14]

Tianhao Chu, Zilong Ji, Junfeng Zuo, Yuanyuan Mi, Wen-hao Zhang, Tiejun Huang, Daniel Bush, Neil Burgess, and Si Wu. Firing rate adaptation affords place cell theta sweeps, phase precession, and procession. eLife, 12:RP87055, 2024. doi:10.7554/eLife.87055.4.

[15]

Gunnar Carlsson. Topology and data. Bulletin of the American Mathematical Society, 46(2):255–308, 2009. doi:10.1090/S0273-0979-09-01249-X.

[16]

Herbert Edelsbrunner and John Harer. Computational topology: an introduction. American Mathematical Soc., 2010.

[17]

James Bradbury, Roy Frostig, Peter Hawkins, Matthew James Johnson, Chris Leary, Dougal Maclaurin, George Necula, Adam Paszke, Jake VanderPlas, Skye Wanderman-Milne, and Qiao Zhang. JAX: composable transformations of Python+NumPy programs. 2018. URL: http://github.com/jax-ml/jax.

[18]

Chaoming Wang, Tianqiu Zhang, Xiaoyu Chen, Sichao He, Shangyang Li, and Si Wu. BrainPy, a flexible, integrative, efficient, and extensible framework for general-purpose brain dynamics programming. eLife, 12:e86365, 2023. doi:10.7554/eLife.86365.

[19]

Tianhao Chu, Yuling Wu, Wentao Qiu, Zihao Jiang, Neil Burgess, Bo Hong, and Si Wu. Localized space coding and phase coding complement each other to achieve robust and efficient spatial representation. bioRxiv, pages 2025–09, 2025.

[20]

Yoram Burak and Ila R Fiete. Accurate path integration in continuous attractor network models of grid cells. PLoS computational biology, 5(2):e1000291, 2009.

[21]

John J Hopfield. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the national academy of sciences, 79(8):2554–2558, 1982. doi:10.1073/pnas.79.8.2554.

[22]

Donald Olding Hebb. The organization of behavior: A neuropsychological theory. Psychology press, 2005.

[23]

S-I Amari. Neural theory of association and concept-formation. Biological cybernetics, 26(3):175–185, 1977. doi:10.1007/BF00365229.

[24]

Guo-qiang Bi and Mu-ming Poo. Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type. Journal of neuroscience, 18(24):10464–10472, 1998. doi:10.1523/JNEUROSCI.18-24-10464.1998.

[25]

Erkki Oja. Simplified neuron model as a principal component analyzer. Journal of mathematical biology, 15(3):267–273, 1982. doi:10.1007/BF00275687.

[26]

Terence D Sanger. Optimal unsupervised learning in a single-layer linear feedforward neural network. Neural networks, 2(6):459–473, 1989. doi:10.1016/0893-6080(89)90044-0.

[27]

Ariane S Etienne and Kathryn J Jeffery. Path integration in mammals. Hippocampus, 14(2):180–192, 2004. doi:10.1002/hipo.10173.

[28]

Alexei Samsonovich and Bruce L McNaughton. Path integration and cognitive mapping in a continuous attractor neural network model. Journal of Neuroscience, 17(15):5900–5920, 1997. doi:10.1523/JNEUROSCI.17-15-05900.1997.

[29]

David Sussillo and Omri Barak. Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks. Neural computation, 25(3):626–649, 2013. doi:10.1162/NECO_a_00409.

[30]

Matthew D Golub and David Sussillo. FixedPointFinder: a TensorFlow toolbox for identifying and characterizing fixed points in recurrent neural networks. Journal of Open Source Software, 3(31):1003, 2018. doi:10.21105/joss.01003.

[31]

Elie L Bienenstock, Leon N Cooper, and Paul W Munro. Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex. Journal of Neuroscience, 2(1):32–48, 1982. doi:10.1523/JNEUROSCI.02-01-00032.1982.

[32]

John O'Keefe and Michael L Recce. Phase relationship between hippocampal place units and the EEG theta rhythm. Hippocampus, 3(3):317–330, 1993. doi:10.1002/hipo.450030307.

[33]

Sichao He. CANNs: continuous attractor neural networks toolkit. 2025. URL: https://github.com/Routhleck/canns, doi:10.5281/zenodo.17412545.

How to Cite References
----------------------

In RST Files
~~~~~~~~~~~~

Use the :cite: role in your text::

    The dynamics of continuous attractors are analyzed by :cite:`wu2008dynamics`.
    Foundational work includes :cite:`amari1977dynamics` and :cite:`wu2016continuous`.

In Jupyter Notebooks
~~~~~~~~~~~~~~~~~~~~

.. important::

   In Jupyter notebooks you must use raw cells set to the reStructuredText format, not Markdown cells.

1. Create a raw cell (Cell → Cell Type → Raw).

2. Set the cell metadata to indicate RST format::

      {
        "raw_mimetype": "text/restructuredtext"
      }

3. Write RST content containing citations::

      This is a paragraph with citations :cite:p:`amari1977dynamics,wu2008dynamics`.

4. At the end of the notebook, add a bibliography directive (in another raw RST cell)::

      References
      ----------

      .. bibliography::
         :cited:
         :style: alpha

Citation Styles
---------------

- :cite:p:`key` renders a parenthetical citation, (Author, Year); the entire citation is clickable.

- :cite:t:`key` renders a textual citation, Author [Year]; only the year is clickable.

Example: see docs/en/0_why_canns.ipynb for a complete working example.
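As a quick sketch of how the two styles read differently in the same sentence (both keys already exist in the bibliography above)::

    :cite:t:`wu2008dynamics` analyzed the tracking dynamics of continuous attractors,
    a line of work that traces back to neural field theory :cite:p:`amari1977dynamics`.

The first citation puts the author names in the running text; the second encloses the whole citation in parentheses.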

Adding New References
---------------------

To add a new reference to the bibliography:

1. Open docs/refs/references.bib.

2. Add your BibTeX entry following the existing format.

3. Use a consistent citation-key format: ``authoryearkeyword`` (for example, ``wu2008dynamics``).

4. Cite the reference with :cite:`key`.

5. The reference will then appear automatically in this list.
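As an illustration of steps 2 and 3, here is a minimal entry following the key convention. This sketch mirrors reference [6] above and may differ from the literal contents of references.bib:

```bibtex
@article{wu2008dynamics,
  author  = {Wu, Si and Hamaguchi, Kosuke and Amari, Shun-ichi},
  title   = {Dynamics and computation of continuous attractors},
  journal = {Neural Computation},
  volume  = {20},
  number  = {4},
  pages   = {994--1025},
  year    = {2008},
  doi     = {10.1162/neco.2008.10-06-378}
}
```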

For more information, see the sphinxcontrib-bibtex documentation.