# Bibliography

[1] D. V. Buonomano and T. P. Carvalho, “Spike-Timing-Dependent Plasticity (STDP),” Encycl. Neurosci., pp. 265–268, 2009.

[2] S. A. Lobov, “Generalized Memory of STDP-Driven Spiking Neural Network,” Math. Biol. Bioinforma., vol. 14, no. 2, pp. 649–664, 2019.

[3] S. Hochreiter and J. Schmidhuber, “Long Short-Term Memory,” Neural Comput., vol. 9, no. 8, pp. 1735–1780, 1997.

[4] P. Poirazi and B. W. Mel, “Impact of active dendrites and structural plasticity on the memory capacity of neural tissue,” Neuron, vol. 29, no. 3, pp. 779–796, 2001.

[5] F. T. Sommer and T. Wennekers, “Associative memory in networks of spiking neurons,” Neural Networks, vol. 14, no. 6–7, pp. 825–834, 2001.

[6] C. R. Laing and C. C. Chow, “Stationary bumps in networks of spiking neurons,” Neural Comput., vol. 13, no. 7, pp. 1473–1494, 2001.

[7] G. Mongillo, O. Barak, and M. Tsodyks, “Synaptic Theory of Working Memory,” Science, vol. 319, no. 5869, pp. 1543–1546, 2008.

[8] F. Zenke, E. J. Agnes, and W. Gerstner, “Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks,” Nat. Commun., vol. 6, 2015.

[9] P. Poirazi and B. W. Mel, “Impact of active dendrites and structural plasticity on the memory capacity of neural tissue,” Neuron, vol. 29, no. 3, pp. 779–796, 2001.

[10] F. T. Sommer and T. Wennekers, “Associative memory in networks of spiking neurons,” Neural Networks, vol. 14, no. 6–7, pp. 825–834, 2001.

[11] G. Mongillo, O. Barak, and M. Tsodyks, “Synaptic Theory of Working Memory,” Science, vol. 319, no. 5869, pp. 1543–1546, Mar. 2008.

[12] F. Zenke, E. J. Agnes, and W. Gerstner, “Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks,” Nat. Commun., vol. 6, Apr. 2015.

[13] G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass, “Long short-term memory and learning-to-learn in networks of spiking neurons,” in Advances in Neural Information Processing Systems, 2018, vol. 2018-December, pp. 787–797.

[14] R. Brette and W. Gerstner, “Adaptive exponential integrate-and-fire model as an effective description of neuronal activity,” J. Neurophysiol., vol. 94, no. 5, pp. 3637–3642, 2005.

[15] Q. Yu, H. Tang, K. C. Tan, and H. Yu, “A brain-inspired spiking neural network model with temporal encoding and learning,” Neurocomputing, vol. 138, pp. 3–13, Aug. 2014.

[16] J. Malmivuo and R. Plonsey, Bioelectromagnetism: Principles and Applications of Bioelectric and Biomagnetic Fields. New York: Oxford Univ. Press, 1995.

[17] S. Williams, “Visual Arctic navigation: Techniques for autonomous agents in glacial environments,” ProQuest Diss. Theses, vol. 3484159, p. 177, 2011.

[18] W. Gerstner, M. Lehmann, V. Liakoni, D. Corneil, and J. Brea, “Eligibility Traces and Plasticity on Behavioral Time Scales: Experimental Support of NeoHebbian Three-Factor Learning Rules,” Frontiers in Neural Circuits, vol. 12. 2018.

[19] J. Wu, Y. Chua, M. Zhang, Q. Yang, G. Li, and H. Li, “Deep Spiking Neural Network with Spike Count based Learning Rule,” in Proceedings of the International Joint Conference on Neural Networks, 2019, vol. 2019-July.

[20] L. R. Squire, “Memory systems of the brain: A brief history and current perspective,” Neurobiol. Learn. Mem., vol. 82, no. 3, pp. 171–177, 2004.

[21] Q. Yu, H. Tang, J. Hu, and K. C. Tan, “A spiking neural network system for robust sequence recognition,” Intell. Syst. Ref. Libr., 2017.

[22] M. Mozafari, M. Ganjtabesh, A. Nowzari-Dalini, S. J. Thorpe, and T. Masquelier, “Bio-inspired digit recognition using reward-modulated spike-timing-dependent plasticity in deep convolutional networks,” Pattern Recognit., 2019.

[23] P. Martínez-Cañada, C. Morillas, B. Pino, E. Ros, and F. Pelayo, “A Computational Framework for Realistic Retina Modeling,” Int. J. Neural Syst., vol. 26, no. 7, pp. 1–18, 2016.

[24] N. Melanitis and K. S. Nikita, “Biologically-inspired image processing in computational retina models,” Comput. Biol. Med., vol. 113, p. 103399, 2019.

[25] M. Hopkins, G. Pineda-Garcia, P. A. Bogdan, and S. B. Furber, “Spiking neural networks for computer vision,” Interface Focus, vol. 8, no. 4, 2018.

[26] S. Mohapatra, H. Gotzig, S. Yogamani, S. Milz, and R. Zöllner, “Exploring deep spiking neural networks for automated driving applications,” in VISIGRAPP 2019 - Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, 2019, vol. 5.

[27] T. Tim, S. Mohammed, S. Mohammed, and R. Hassan, “Genetic Algorithm for Neural Network Architecture Optimization Modelling and Scientific,” pp. 1–4, 2006.

[28] Y. Sun, B. Xue, M. Zhang, and G. G. Yen, “Automatically Designing CNN Architectures Using Genetic Algorithm for Image Classification,” pp. 1–12, 2018.

[29] G. Marcus, “Deep Learning: A Critical Appraisal,” pp. 1–27, 2018.

[30] C. D. Schuman et al., “A Survey of Neuromorphic Computing and Neural Networks in Hardware,” pp. 1–88, 2017.

[31] M. R. Minar and J. Naher, “Recent Advances in Deep Learning: An Overview,” 2018.

[32] J. Schmidhuber, “Deep Learning in neural networks: An overview,” Neural Networks, vol. 61. pp. 85–117, 2015.

[33] “IEEE Xplore Full-Text PDF:” [Online]. Available: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8697857. [Accessed: 25-Nov-2019].

[34] A. Shrestha and A. Mahmood, “Review of deep learning algorithms and architectures,” IEEE Access, vol. 7, pp. 53040–53065, 2019.

[35] J. Guerguiev, T. P. Lillicrap, and B. A. Richards, “Towards deep learning with segregated dendrites,” Elife, vol. 6, pp. 1–37, 2017.

[36] M. J. Rozenberg, O. Schneegans, and P. Stoliar, “An ultra-compact leaky-integrate-and-fire model for building spiking neural networks,” Sci. Rep., vol. 9, no. 1, p. 11123, 2019.

[37] S. G. Wysoski, L. Benuskova, and N. Kasabov, “On-line learning with structural adaptation in a network of spiking neurons for visual pattern recognition,” Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), vol. 4131 LNCS, pp. 61–70, 2006.

[38] A. Sboev, D. Vlasov, R. Rybka, and A. Serenko, “Solving a classification task by spiking neurons with STDP and temporal coding,” Procedia Comput. Sci., 2018.

[39] H. Paugam-Moisy, “Spiking Neuron Networks A survey,” Idiapch, 2006.

[40] F. Alnajjar and K. Murase, “A simple aplysia-like spiking neural network to generate adaptive behavior in autonomous robots,” Adapt. Behav., 2008.

[41] J. Wang, A. Belatreche, L. P. Maguire, and T. M. McGinnity, “Spikecomp: An evolving spiking neural network with adaptive compact structure for pattern classification,” Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), 2015.

[42] R. Kozma and M. Kitamura, “Dynamic structure adaptation in feed-forward neural networks - an example of plant monitoring,” IEEE Int. Conf. Neural Networks - Conf. Proc., vol. 2, pp. 692–697, 1995.

[43] C. Cortes, X. Gonzalvo, V. Kuznetsov, M. Mohri, and S. Yang, “AdaNet: Adaptive structural learning of artificial neural networks,” 34th Int. Conf. Mach. Learn. ICML 2017, vol. 2, pp. 1452–1466, 2017.

[44] S. Yadav and A. Sood, “Adaptation in Neural Networks: A Review,” vol. 2, no. 11, pp. 3278–3281, 2013.

[45] J. Wang, A. Belatreche, L. Maguire, and T. M. McGinnity, “An online supervised learning method for spiking neural networks with adaptive structure,” Neurocomputing, 2014.

[46] A. M. Zador, “A critique of pure learning and what artificial neural networks can learn from animal brains,” Nat. Commun., vol. 10, no. 1, 2019.

[47] A. Gaier and D. Ha, “Weight Agnostic Neural Networks,” no. NeurIPS, pp. 1–19, 2019.

[48] S. Deneve, “Bayesian spiking neurons II: Learning,” Neural Comput., 2008.

[49] B. Gardner, I. Sporea, and A. Grüning, “Learning spatiotemporally encoded pattern transformations in structured spiking neural networks,” Neural Comput., vol. 27, no. 12, pp. 2548–2586, Dec. 2015.

[50] C. D. James et al., “A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications,” Biologically Inspired Cognitive Architectures, vol. 19. pp. 49–64, 2017.

[51] G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass, “Long short-term memory and learning-to-learn in networks of spiking neurons,” in Advances in Neural Information Processing Systems, 2018, vol. 2018-December.

[52] S. K. Esser, R. Appuswamy, P. A. Merolla, J. V. Arthur, and D. S. Modha, “Backpropagation for energy-efficient neuromorphic computing,” in Advances in Neural Information Processing Systems, 2015, vol. 2015-January.

[53] J. C. Thiele, O. Bichler, and A. Dupret, “Event-based, timescale invariant unsupervised online deep learning with STDP,” Front. Comput. Neurosci., vol. 12, 2018.

[54] Y. Cao, Y. Chen, and D. Khosla, “Spiking Deep Convolutional Neural Networks for Energy-Efficient Object Recognition,” Int. J. Comput. Vis., vol. 113, no. 1, 2015.

[55] G. Haessig, A. Cassidy, R. Alvarez, R. Benosman, and G. Orchard, “Spiking Optical Flow for Event-Based Sensors Using IBM’s TrueNorth Neurosynaptic System,” in IEEE Transactions on Biomedical Circuits and Systems, 2018, vol. 12, no. 4, pp. 860–870.

[56] W. Nicola and C. Clopath, “Supervised learning in spiking neural networks with FORCE training,” Nat. Commun., vol. 8, no. 1, 2017.

[57] P. Sharma and A. Singh, “Era of deep neural networks: A review,” in 8th International Conference on Computing, Communications and Networking Technologies, ICCCNT 2017, 2017.

[58] T. Gorach, “Deep Convolutional Neural Networks - A Review,” Int. Res. J. Eng. Technol., vol. 56, no. 5, pp. 1235–1250, 2018.

[59] C. A. Mitrea, M. G. Constantin, L. D. Ştefan, M. Ghenescu, and B. Ionescu, “Little-Big Deep Neural Networks for Embedded Video Surveillance,” in 2018 12th International Conference on Communications, COMM 2018 - Proceedings, 2018, pp. 493–496.

[60] J. Schmidhuber, “Deep Learning in neural networks: An overview,” Neural Networks, vol. 61, pp. 85–117, 2015.

[61] N. Aloysius and M. Geetha, “A review on deep convolutional neural networks,” in Proceedings of the 2017 IEEE International Conference on Communication and Signal Processing, ICCSP 2017, 2018, vol. 2018-January, pp. 588–592.

[62] C. A. Mitrea, M. G. Constantin, L. D. Ştefan, M. Ghenescu, and B. Ionescu, “Little-Big Deep Neural Networks for Embedded Video Surveillance,” in 2018 12th International Conference on Communications, COMM 2018 - Proceedings, 2018, pp. 493–496.

[63] I. Vasilyev, V. Leontyev, and S. Polovko, “Modeling the management of a service underwater vehicle while maneuvering in an environment with obstacles,” IOP Conf. Ser. Earth Environ. Sci., vol. 302, p. 012058, 2019.

[64] D. Stepanov, A. Popov, D. Gromoshinskii, and O. Shmakov, “Visual-inertial sensor fusion to accuracy increase of autonomous underwater vehicles positioning,” Ann. DAAAM Proc. Int. DAAAM Symp., vol. 29, no. 1, pp. 0615–0623, 2018.

[65] P. U. Diehl and M. Cook, “Unsupervised learning of digit recognition using spike-timing-dependent plasticity,” Front. Comput. Neurosci., vol. 9, 2015.

[66] A. Tavanaei, T. Masquelier, and A. Maida, “Representation learning using event-based STDP,” Neural Networks, vol. 105, 2018.

[67] Y. Sheng, Y. Wang, L. Wang, and G. Zhao, “A comparison of learning rules in pulse-based neural networks,” in 2016 13th International Computer Conference on Wavelet Active Media Technology and Information Processing, ICCWAMTIP 2016, 2016, pp. 95–98.

[68] Y. Xu, X. Zeng, L. Han, and J. Yang, “A supervised multi-spike learning algorithm based on gradient descent for spiking neural networks,” Neural Networks, 2013.

[69] X. She, Y. Long, and S. Mukhopadhyay, “Fast and Low-Precision Learning in GPU-Accelerated Spiking Neural Network,” in Proceedings of the 2019 Design, Automation and Test in Europe Conference and Exhibition, DATE 2019, 2019.

[70] K. Ahmed, A. Shrestha, Y. Wang, and Q. Qiu, “System design for in-hardware STDP learning and spiking based probabilistic inference,” in Proceedings of IEEE Computer Society Annual Symposium on VLSI, ISVLSI, 2016.

[71] M. Mozafari, S. R. Kheradpisheh, T. Masquelier, A. Nowzari-Dalini, and M. Ganjtabesh, “First-spike-based visual categorization using reward-modulated STDP,” IEEE Trans. Neural Networks Learn. Syst., 2018.

[72] X. Lin, X. Wang, and Z. Hao, “Supervised learning in multilayer spiking neural networks with inner products of spike trains,” Neurocomputing, 2017.

[73] N. Yusoff, F. Kabir Ahmad, N. ChePa, and A. Ab Aziz, “Learning stimulus-stimulus association in spatiotemporal neural networks,” J. Teknol., 2015.

[74] A. V. Bakhshiev and F. V. Gundelakh, “Mathematical model of the impulses transformation processes in natural neurons for biologically inspired control systems development,” in CEUR Workshop Proceedings, 2015, vol. 1452, pp. 1–12.

[75] A. V. Bakhshiev and F. V. Gundelakh, The model of the robot’s hierarchical behavioral control system, vol. 9719. 2016.

[76] D. Stepanov, A. Bakhshiev, D. Gromoshinskii, N. Kirpan, and F. Gundelakh, Determination of the relative position of space vehicles by detection and tracking of natural visual features with the existing tv-cameras, vol. 542. 2015.

[77] I. S. Fomin, A. V. Bakhshiev, and D. A. Gromoshinskii, “Study of Using Deep Learning Nets for Mark Detection in Space Docking Control Images,” in Procedia Computer Science, 2017, vol. 103, pp. 59–66.

[78] A. M. Korsakov, I. S. Fomin, D. A. Gromoshinsky, A. V. Bakhshiev, D. N. Stepanov, and E. Smirnova, “Determination of an unmanned mobile object orientation by natural landmarks,” in CEUR Workshop Proceedings, 2016, vol. 1710, pp. 91–101.

[79] I. Fomin, V. Mikhailov, A. Bakhshiev, N. Merkulyeva, A. Veshchitskii, and P. Musienko, Detection of Neurons on Images of the Histological Slices Using Convolutional Neural Network, vol. 736. 2018.

[80] V. V. Mikhaylov and A. V. Bakhshiev, “The System for Histopathology Images Analysis of Spinal Cord Slices,” in Procedia Computer Science, 2017, vol. 103, pp. 239–243.

[81] A. Bakhshiev and L. Stankevich, Prospects for the Development of Neuromorphic Systems, vol. 736. 2018.

[82] N. Filatov, V. Vlasenko, I. Fomin, and A. Bakhshiev, Application of deep neural network for the vision system of mobile service robot, vol. 856. 2020.

[83] I. Fomin, D. Gromoshinskii, and A. Bakhshiev, Object Detection on Images in Docking Tasks Using Deep Neural Networks, vol. 736. 2018.

[84] I. S. Fomin and A. V. Bakhshiev, Research on convolutional neural network for object classification in outdoor video surveillance system, vol. 856. 2020.

[85] A. V. Bakhshiev and F. V. Gundelakh, “Application the Spiking Neuron Model with Structural Adaptation to Describe Neuromorphic Systems,” in Procedia Computer Science, 2017, vol. 103, pp. 190–197.

[86] I. S. Fomin, S. R. Orlova, D. A. Gromoshinskii, and A. V. Bakhshiev, Object detection on docking images with deep convolutional network, vol. 799. 2019.

[87] J. Wang, A. Belatreche, L. P. Maguire, and T. M. McGinnity, “SpikeTemp: An Enhanced Rank-Order-Based Learning Approach for Spiking Neural Networks with Adaptive Structure,” IEEE Trans. Neural Networks Learn. Syst., vol. 28, no. 1, pp. 30–43, Jan. 2017.

[88] T. P. Lillicrap, D. Cownden, D. B. Tweed, and C. J. Akerman, “Random synaptic feedback weights support error backpropagation for deep learning,” Nat. Commun., vol. 7, 2016.

[89] Y. Jin, W. Zhang, and P. Li, “Hybrid macro/micro level backpropagation for training deep spiking neural networks,” in Advances in Neural Information Processing Systems, 2018, vol. 2018-December.

[90] P. Ferré, F. Mamalet, and S. J. Thorpe, “Unsupervised feature learning with winner-takes-all based STDP,” Front. Comput. Neurosci., vol. 12, 2018.

[91] J. Wang, A. Belatreche, L. P. Maguire, and T. M. McGinnity, “SpikeTemp: An Enhanced Rank-Order-Based Learning Approach for Spiking Neural Networks with Adaptive Structure,” IEEE Trans. Neural Networks Learn. Syst., vol. 28, no. 1, pp. 30–43, 2017.

[92] H. Mostafa, “Supervised learning based on temporal coding in spiking neural networks,” IEEE Trans. Neural Networks Learn. Syst., vol. 29, no. 7, 2018.

[93] J. Kim, H. Kim, S. Huh, J. Lee, and K. Choi, “Deep neural networks with weighted spikes,” Neurocomputing, vol. 311, 2018.

[94] M. Bouvier et al., “Spiking neural networks hardware implementations and challenges: A survey,” ACM Journal on Emerging Technologies in Computing Systems, vol. 15, no. 2. 2019.

[95] Y. Wu, L. Deng, G. Li, J. Zhu, and L. Shi, “Spatio-temporal backpropagation for training high-performance spiking neural networks,” Front. Neurosci., vol. 12, 2018.

[96] G. Orchard, A. Jayawant, G. K. Cohen, and N. Thakor, “Converting static image datasets to spiking neuromorphic datasets using saccades,” Front. Neurosci., vol. 9, 2015.

[97] H. Hazan et al., “BindsNET: A machine learning-oriented spiking neural networks library in python,” Front. Neuroinform., vol. 12, 2018.

[98] D. Huh and T. J. Sejnowski, “Gradient descent for spiking neural networks,” in Advances in Neural Information Processing Systems, 2018, vol. 2018-December.

[99] F. Zenke and S. Ganguli, “SuperSpike: Supervised learning in multilayer spiking neural networks,” Neural Comput., vol. 30, no. 6, 2018.

[100] B. Gardner, I. Sporea, and A. Grüning, “Learning spatiotemporally encoded pattern transformations in structured spiking neural networks,” Neural Computation, vol. 27, no. 12. 2015.

[101] B. Rueckauer, I. A. Lungu, Y. Hu, M. Pfeiffer, and S. C. Liu, “Conversion of continuous-valued deep networks to efficient event-driven networks for image classification,” Front. Neurosci., vol. 11, 2017.

[102] A. Taherkhani, A. Belatreche, Y. Li, and L. P. Maguire, “A Supervised Learning Algorithm for Learning Precise Timing of Multiple Spikes in Multilayer Spiking Neural Networks,” IEEE Trans. Neural Networks Learn. Syst., vol. 29, no. 11, 2018.

[103] P. Panda and K. Roy, “Unsupervised regenerative learning of hierarchical features in Spiking Deep Networks for object recognition,” in Proceedings of the International Joint Conference on Neural Networks, 2016, vol. 2016-October.

[104] A. Tavanaei, M. Ghodrati, S. R. Kheradpisheh, T. Masquelier, and A. Maida, “Deep learning in spiking neural networks,” Neural Networks, vol. 111. pp. 47–63, 2019.

[105] S. R. Kheradpisheh, M. Ganjtabesh, S. J. Thorpe, and T. Masquelier, “STDP-based spiking deep convolutional neural networks for object recognition,” Neural Networks, vol. 99, 2018.

[106] A. Tavanaei and A. Maida, “BP-STDP: Approximating backpropagation using spike timing dependent plasticity,” Neurocomputing, vol. 330, 2019.

[107] O. Rhodes et al., “sPyNNaker: A software package for running PyNN simulations on SpiNNaker,” Front. Neurosci., vol. 12, 2018.

[108] C. Lee, G. Srinivasan, P. Panda, and K. Roy, “Deep Spiking Convolutional Neural Network Trained with Unsupervised Spike Timing Dependent Plasticity,” IEEE Transactions on Cognitive and Developmental Systems, 2018.

[109] S. R. Kheradpisheh, M. Ganjtabesh, and T. Masquelier, “Bio-inspired unsupervised learning of visual features leads to robust invariant object recognition,” Neurocomputing, vol. 205, 2016.

[110] J. Wu, Y. Chua, M. Zhang, H. Li, and K. C. Tan, “A spiking neural network framework for robust sound classification,” Front. Neurosci., vol. 12, 2018.

[111] “A Study on Video Surveillance System for Object Detection and Tracking,” Int. Conf. Comput. Sustain. Glob. Dev., pp. 221–226, 2016.

[112] V. C. Banu, I. M. Costea, F. C. Nemtanu, and I. Bǎdescu, “Intelligent video surveillance system,” 2017 IEEE 23rd Int. Symp. Des. Technol. Electron. Packag. SIITME 2017 - Proc., vol. 2018-Janua, pp. 208–212, 2018.

[113] J. Chen, K. Li, Q. Deng, K. Li, and P. S. Yu, “Distributed Deep Learning Model for Intelligent Video Surveillance Systems with Edge Computing,” IEEE Trans. Ind. Informatics, 2019.

[114] K. He, X. Zhang, S. Ren, and J. Sun, “Deep Residual Learning for Image Recognition,” in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), 2016, pp. 770–778.

[115] K. Muralidharan, “Deep Neural Networks for Video Surveillance: A Review,” vol. 7, no. 4, pp. 102–108.

[116] B. Singh, D. Singh, G. Singh, N. Sharma, and V. Sibbal, “Motion detection for video surveillance,” 2014 Int. Conf. Signal Propag. Comput. Technol. ICSPCT 2014, pp. 578–584, 2014.

[117] C. Kim, J. Lee, T. Han, and Y. M. Kim, “A hybrid framework combining background subtraction and deep neural networks for rapid person detection,” J. Big Data, vol. 5, no. 1, 2018.

[118] K. Simonyan and A. Zisserman, “Very Deep Convolutional Networks for Large-Scale Image Recognition,” pp. 1–14, 2014.

[119] S. V. Adams et al., “Behavioral Learning in a Cognitive Neuromorphic Robot: An Integrative Approach,” IEEE Trans. Neural Networks Learn. Syst., vol. 29, no. 12, pp. 6132–6144, 2018.

[120] C. Lee, S. S. Sarwar, and K. Roy, “Enabling Spike-based Backpropagation in State-of-the-art Deep Neural Network Architectures,” 2019.

[121] M. P. Fok, Y. Tian, D. Rosenbluth, and P. R. Prucnal, “Asynchronous spiking photonic neuron for lightwave neuromorphic signal processing,” Opt. Lett., vol. 37, no. 16, p. 3309, 2012.

[122] A. Jeyasothy, S. Sundaram, and N. Sundararajan, “SEFRON: A New Spiking Neuron Model With Time-Varying Synaptic Efficacy Function for Pattern Classification,” IEEE Trans. Neural Networks Learn. Syst., vol. PP, pp. 1–10, 2018.

[123] S. C. Liu and T. Delbruck, “Neuromorphic sensory systems,” Curr. Opin. Neurobiol., vol. 20, no. 3, pp. 288–295, 2010.

[124] D. Rosenbluth et al., “Signal feature recognition based on lightwave neuromorphic signal processing,” Opt. Lett., vol. 36, no. 1, p. 19, 2010.

[125] G. Indiveri and S. C. Liu, “Memory and Information Processing in Neuromorphic Systems,” Proceedings of the IEEE, vol. 103, no. 8. pp. 1379–1397, 2015.

[126] M. J. Pearson et al., “Implementing spiking neural networks for real-time signal-processing and control applications: A model-validated FPGA approach,” IEEE Trans. Neural Networks, vol. 18, no. 5, pp. 1472–1487, 2007.

[127] G. Orchard, R. Benosman, R. Etienne-Cummings, and N. V. Thakor, “A spiking neural network architecture for visual motion estimation,” in 2013 IEEE Biomedical Circuits and Systems Conference, BioCAS 2013, 2013, pp. 298–301.

[128] A. S. Miller, B. H. Blott, and T. K. Hames, “Review of neural network applications in medical imaging and signal processing,” Medical & Biological Engineering & Computing, vol. 30, no. 5. pp. 449–464, 1992.

[129] B. J. Shastri, A. N. Tait, M. A. Nahmias, and P. R. Prucnal, “Photonic spike processing: ultrafast laser neurons and an integrated photonic network,” vol. 32, no. 21, pp. 3427–3439, 2014.

[130] H. Hazan, D. Saunders, D. T. Sanghavi, H. Siegelmann, and R. Kozma, “Unsupervised Learning with Self-Organizing Spiking Neural Networks,” in Proceedings of the International Joint Conference on Neural Networks, 2018, vol. 2018-July.

[131] A. N. Tait, Nanophotonic Information Physics. 2014.

[132] J. Ebbers, J. Heitkaemper, J. Schmalenstroeer, and R. Haeb-Umbach, “Benchmarking Neural Network Architectures for Acoustic Sensor Networks,” in Proc. of ITG Fachtagung Sprachkommunikation (Speech Communications), 2018.

[133] G. Srinivasan, P. Panda, and K. Roy, “STDP-based Unsupervised Feature Learning using Convolution-over-time in Spiking Neural Networks for Energy-Efficient Neuromorphic Computing,” ACM J. Emerg. Technol. Comput. Syst., vol. 14, no. 4, pp. 1–12, 2018.

[134] B. D. Hoskins et al., “Streaming Batch Eigenupdates for Hardware Neuromorphic Networks,” pp. 1–13, 2019.

[135] C. Lee, P. Panda, G. Srinivasan, and K. Roy, “Training deep spiking convolutional Neural Networks with STDP-based unsupervised pre-training followed by supervised fine-tuning,” Front. Neurosci., vol. 12, 2018.

[136] S. Chetlur et al., “cuDNN: Efficient Primitives for Deep Learning,” pp. 1–9, 2014.

[137] D. J. Saunders, H. T. Siegelmann, R. Kozma, and M. Ruszinkó, “STDP Learning of Image Patches with Convolutional Spiking Neural Networks,” in Proceedings of the International Joint Conference on Neural Networks, 2018, vol. 2018-July.

[138] “Neuromorphic Computing Architectures, Models, and Applications: A Beyond-CMOS Approach to Future Computing.”

[139] “Разработка искусственных когнитивных систем на основе моделей мозга живых организмов” [Development of artificial cognitive systems based on models of the brains of living organisms], pp. 1156–1171, 2014.

[140] D. Zhu et al., “Design and Hardware Implementation of Neuromorphic Systems With RRAM Synapses and Threshold-Controlled Neurons for Pattern Recognition,” IEEE Trans. Circuits Syst. I Regul. Pap., vol. 65, no. 9, pp. 2726–2738, 2018.

[141] G. Indiveri and S. C. Liu, “Memory and Information Processing in Neuromorphic Systems,” Proceedings of the IEEE, vol. 103, no. 8. pp. 1379–1397, 2015.

[142] R. Kurino, M. Sugisaka, and K. Shibata, “Growing neural network with hidden neurons,” Proc. 9th Int. Symp. Artif. Life and Robot., vol. 1, 2015.

[143] I. K. Schuller and R. Stevens, “Neuromorphic Computing: From Materials to Systems Architecture - Report of a Roundtable Convened to Consider Neuromorphic Computing Basic Research Needs,” 2015.

[144] A. G. Andreou, “Part I: Neuromorphic architectures,” pp. 1–25, 2009.

[145] N. Srinivasa, J. Cruz-Albrecht, and Y. Cho, “Cortical neuromorphic network, system and method,” 2015.

[146] J. S. Gans, “Self-Regulating Artificial General Intelligence,” 2018.

[147] K. Khalil, O. Eldash, A. Kumar, and M. Bayoumi, “An Efficient Approach for Neural Network Architecture,” 2018 25th IEEE Int. Conf. Electron. Circuits Syst., pp. 745–748, 2019.

[148] C. D. Schuman and A. Disney, “Dynamic Adaptive Neural Network Arrays: A Neuromorphic Architecture,” Proc. Work. Mach. Learn. High-Performance Comput. Environ. - MLHPC ’15, pp. 1–4, 2015.

[149] D. Ma et al., “Darwin: A neuromorphic hardware co-processor based on spiking neural networks,” J. Syst. Archit., vol. 77, pp. 43–51, Jun. 2017.

[150] Z. Wang et al., “Memristors with diffusive dynamics as synaptic emulators for neuromorphic computing,” Nat. Mater., vol. 16, no. 1, pp. 101–108, 2017.

[151] W. Maass, “Noise as a resource for computation and learning in networks of spiking neurons,” Proc. IEEE, 2014.

[152] E. Neftci, S. Das, B. Pedroni, K. Kreutz-Delgado, and G. Cauwenberghs, “Event-driven contrastive divergence for spiking neuromorphic systems,” Front. Neurosci., 2014.

[153] A. Grüning and S. M. Bohte, “Spiking Neural Networks: Principles and Challenges,” in Proc. Eur. Symp. Artif. Neural Netw. (ESANN), 2014.

[154] F. Akopyan et al., “TrueNorth: Design and Tool Flow of a 65 mW 1 Million Neuron Programmable Neurosynaptic Chip,” IEEE Trans. Comput. Des. Integr. Circuits Syst., 2015.

[155] J. H. Lee, T. Delbruck, and M. Pfeiffer, “Training deep spiking neural networks using backpropagation,” Front. Neurosci., 2016.

[156] E. Yavuz, J. Turner, and T. Nowotny, “GeNN: A code generation framework for accelerated brain simulations,” Sci. Rep., 2016.

[157] G. Indiveri, F. Corradi, and N. Qiao, “Neuromorphic architectures for spiking deep neural networks,” Tech. Dig. - Int. Electron Devices Meet. IEDM, 2015.

[158] S. B. Furber, F. Galluppi, S. Temple, and L. A. Plana, “The SpiNNaker project,” Proc. IEEE, vol. 102, no. 5, pp. 652–665, 2014.

[159] W. Maass, “To Spike or Not to Spike: That Is the Question,” Proc. IEEE, vol. 103, no. 12, 2015.

[160] S. K. Esser et al., “Cognitive computing systems: Algorithms and applications for networks of neurosynaptic cores,” in Proceedings of the International Joint Conference on Neural Networks, 2013.

[161] J. Brea and W. Gerstner, “Does computational neuroscience need new synaptic learning paradigms?,” Current Opinion in Behavioral Sciences, vol. 11. 2016.

[162] A. V. Bakhshiev and F. V. Gundelakh, “Application the Spiking Neuron Model with Structural Adaptation to Describe Neuromorphic Systems,” in Procedia Computer Science, 2017, vol. 103, pp. 190–197.

[163] F. Y. H. Ahmed, B. Yusob, and H. N. A. Hamed, “Computing with spiking neuron networks a review,” Int. J. Adv. Soft Comput. its Appl., vol. 6, no. 1, 2014.

[164] S. K. Esser et al., “Convolutional Networks for Fast, Energy-Efficient Neuromorphic Computing,” Proc. Natl. Acad. Sci. U.S.A., vol. 113, no. 41, pp. 11441–11446, 2016.

[165] Y. Zhang, P. Li, Y. Jin, and Y. Choe, “A Digital Liquid State Machine with Biologically Inspired Learning and Its Application to Speech Recognition,” IEEE Trans. Neural Networks Learn. Syst., vol. 26, no. 11, 2015.

[166] S. B. Furber et al., “Overview of the SpiNNaker system architecture,” IEEE Transactions on Computers, vol. 62, no. 12. 2013.

[167] L. F. Abbott, B. DePasquale, and R. M. Memmesheimer, “Building functional networks of spiking model neurons,” Nature Neuroscience, vol. 19, no. 3. pp. 350–355, 2016.

[168] D. Thalmeier, M. Uhlmann, H. J. Kappen, and R. M. Memmesheimer, “Learning Universal Computations with Spikes,” PLoS Comput. Biol., vol. 12, no. 6, 2016.

[169] P. A. Merolla et al., “A million spiking-neuron integrated circuit with a scalable communication network and interface,” Science (80-. )., vol. 345, no. 6197, pp. 668–673, 2014.

[170] J. Schmidhuber, “Deep Learning in neural networks: An overview,” Neural Networks, 2015.

[171] E. M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Cambridge, MA: MIT Press, 2007.

[172] E. M. Izhikevich, “Which model to use for cortical spiking neurons?,” IEEE Trans. Neural Networks, vol. 15, no. 5, pp. 1063–1070, 2004.

[173] E. M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Cambridge, MA: MIT Press, 2007.

[174] J. L. McKinstry, G. M. Edelman, and J. L. Krichmar, “A cerebellar model for predictive motor control tested in a brain-based device,” Proc. Natl. Acad. Sci. U. S. A., vol. 103, no. 9, pp. 3387–3392, 2006.

[175] E. M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Cambridge, MA: MIT Press, 2007.

[176] K. A. Zaghloul and K. Boahen, “A silicon retina that reproduces signals in the optic nerve,” J. Neural Eng., vol. 3, no. 4, pp. 257–267, 2006.

[177] K. P. Körding and P. König, “A spike based learning rule for generation of invariant representations,” J. Physiol. Paris, vol. 94, no. 5–6, pp. 539–548, 2000.

[178] E. M. Izhikevich, “Simple model of spiking neurons,” IEEE Trans. Neural Networks, vol. 14, no. 6, pp. 1569–1572, 2003.

[179] J. S. Seo et al., “A 45nm CMOS neuromorphic chip with a scalable architecture for learning in networks of spiking neurons,” in Proceedings of the Custom Integrated Circuits Conference, 2011.

[180] A. N. Burkitt, “A review of the integrate-and-fire neuron model: I. Homogeneous synaptic input,” Biol. Cybern., vol. 95, no. 1, pp. 1–19, 2006.

[181] E. B. Hendrickson, J. R. Edgerton, and D. Jaeger, “The capabilities and limitations of conductance-based compartmental neuron models with reduced branched or unbranched morphologies and active dendrites,” J. Comput. Neurosci., vol. 30, no. 2, pp. 301–321, 2011.

[182] C. Finke, J. A. Freund, E. Rosa, P. H. Bryant, H. A. Braun, and U. Feudel, “Temperature-dependent stochastic dynamics of the Huber-Braun neuron model,” Chaos, vol. 21, no. 4, 2011.

[183] R. FitzHugh, “Impulses and Physiological States in Theoretical Models of Nerve Membrane,” Biophys. J., vol. 1, no. 6, pp. 445–466, 1961.

[184] C. Finke et al., “Noisy activation kinetics induces bursting in the Huber-Braun neuron model,” Eur. Phys. J. Spec. Top., vol. 187, no. 1, pp. 199–203, 2010.

[185] D. Chung et al., “A New Robotics Platform for Neuromorphic Vision: Beobots,” Lect. Notes Comput. Sci., vol. 2525, pp. 325–340, 2002.

[186] J. Fleischer and G. Edelman, “Brain-based devices,” IEEE Robot. Autom. Mag., vol. 16, no. 3, pp. 33–41, 2009.

[187] B. Goertzel, R. Lian, I. Arel, H. de Garis, and S. Chen, “A world survey of artificial brain projects, Part II: Biologically inspired cognitive architectures,” Neurocomputing, vol. 74, no. 1–3, pp. 30–49, 2010.

[188] H. de Garis, C. Shuo, B. Goertzel, and L. Ruiting, “A world survey of artificial brain projects, Part I: Large-scale brain simulations,” Neurocomputing, vol. 74, pp. 3–29, 2010.

[189] H. A. Braun et al., “Noise-induced impulse pattern modifications at different dynamical period-one situations in a computer model of temperature encoding,” BioSystems, vol. 62, no. 1–3, pp. 99–112, 2001.

[190] A. L. Hodgkin and A. F. Huxley, “A quantitative description of membrane current and its application to conduction and excitation in nerve,” J. Physiol., vol. 117, no. 4, pp. 500–544, 1952.

[191] J. L. Krichmar and G. M. Edelman, “Machine psychology: autonomous behavior, perceptual categorization and conditioning in a brain-based device,” Cereb. Cortex, vol. 12, no. 8, pp. 818–830, 2002.

[192] J. L. Krichmar and G. M. Edelman, “Brain-based devices for the study of nervous systems and the development of intelligent machines.,” Artif. Life, vol. 11, no. 1–2, pp. 63–77, 2005.

[193] M. H. Hennig, “Theoretical models of synaptic short term plasticity.,” Front. Comput. Neurosci., vol. 7, p. 45, 2013.

[194] P. Häfliger, “Adaptive WTA with an analog VLSI neuromorphic learning chip,” IEEE Trans. Neural Networks, vol. 18, no. 2, pp. 551–572, 2007.

[195] A. N. Burkitt, “A review of the integrate-and-fire neuron model: II. Inhomogeneous synaptic input and network properties,” Biological Cybernetics, vol. 95, no. 2. pp. 97–112, 2006.