Dynamical neuroscience
The dynamical systems approach to neuroscience is a branch of mathematical biology that utilizes nonlinear dynamics to understand and model the nervous system and its functions. In a dynamical system, all possible states are expressed by a phase space.[1] Such systems can experience bifurcation (a qualitative change in behavior) as a function of their bifurcation parameters and often exhibit chaos.[2] Dynamical neuroscience describes the nonlinear dynamics at many levels of the brain, from single neural cells[3] to cognitive processes, sleep states, and the behavior of neurons in large-scale neuronal simulation.[4] Neurons have been modeled as nonlinear systems for decades, but dynamical systems emerge in numerous other ways in the nervous system. In chemistry, reaction models like the Gray–Scott model exhibit rich, chaotic dynamics.[5][6] Dynamic interactions between extracellular fluid pathways reshape our view of intraneural communication.[7] Information theory draws on thermodynamics in the development of infodynamics, which can involve nonlinear systems, especially with regard to the brain.
History
One of the first well-known instances in which neurons were modeled on a mathematical and physical basis was the integrate-and-fire model, developed by Louis Lapicque in 1907. Decades later, the discovery of the squid giant axon eventually led Alan Hodgkin and Andrew Huxley (half-brother to Aldous Huxley) to develop the Hodgkin–Huxley model of the neuron in 1952.[8] This model was simplified by the FitzHugh–Nagumo model in 1962.[9] By 1981, the Morris–Lecar model had been developed for the barnacle muscle fiber.
These mathematical models proved useful and are still used in biophysics today, but a late-20th-century development propelled the dynamical study of neurons even further: computer technology. The central difficulty with physiological equations like those above is that they are nonlinear, which makes standard analytical techniques inapplicable and leaves an essentially endless space of possible behaviors to explore. Computers opened many doors for all of the hard sciences by making it possible to approximate solutions to nonlinear equations numerically. This is the aspect of computational neuroscience that dynamical systems encompasses.
In 2007, Eugene Izhikevich published the canonical textbook Dynamical Systems in Neuroscience, helping to transform an obscure research topic into an established line of academic study.
Neuron dynamics
Neurons can be treated as nonlinear dynamical systems: their state variables, such as membrane potential and ion-channel activation, evolve in time according to coupled differential equations, and their firing behavior can be analyzed with the tools of dynamical systems theory.
Electrophysiology of the neuron
The motivation for a dynamical approach to neuroscience stems from an interest in the physical complexity of neuron behavior. As an example, consider the coupled interaction between a neuron's membrane potential and the activation of ion channels throughout the neuron. As the membrane potential of a neuron increases sufficiently, channels in the membrane open up to allow more ions in or out. The ion flux further alters the membrane potential, which further affects the activation of the ion channels, which affects the membrane potential, and so on. Such mutual dependence is the hallmark of coupled nonlinear equations. A relatively straightforward example is the Morris–Lecar model:
- [math]\displaystyle{ \begin{align} C {dV \over dt} & = -g_{Ca} M_{ss} (V-V_{Ca}) - g_K N (V-V_K) - g_L(V-V_L) + I_\text{app} \\[6pt] {dN \over dt} & = {{N_{ss} - N} \over {\tau_N}} \end{align} }[/math]
See the Morris–Lecar paper[10] for an in-depth treatment of the model; a briefer summary of the Morris–Lecar model is given by Scholarpedia.[11]
The point here is to demonstrate the physiological basis of dynamical neuron models, so this discussion covers only the two state variables of the equations:
- [math]\displaystyle{ V }[/math] represents the membrane potential of the neuron
- [math]\displaystyle{ N }[/math] is the so-called "recovery variable", which gives us the probability that a particular potassium channel is open to allow ion conduction.
Most importantly, the first equation states that the change of [math]\displaystyle{ V }[/math] with respect to time depends on both [math]\displaystyle{ V }[/math] and [math]\displaystyle{ N }[/math], as does the change in [math]\displaystyle{ N }[/math] with respect to time. [math]\displaystyle{ M_{ss} }[/math] and [math]\displaystyle{ N_{ss} }[/math] are both functions of [math]\displaystyle{ V }[/math]. So we have two coupled functions, [math]\displaystyle{ f(V,N) }[/math] and [math]\displaystyle{ g(V,N) }[/math].
Different types of neuron models utilize different channels, depending on the physiology of the organism involved. For instance, the simplified two-dimensional Hodgkin–Huxley model considers sodium channels, while the Morris–Lecar model considers calcium channels. Both models consider potassium and leak currents. Note, however, that the Hodgkin–Huxley model is canonically four-dimensional.[12]
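To make the coupled dynamics concrete, the following Python sketch integrates the two Morris–Lecar equations with a simple forward-Euler scheme. The parameter values are commonly used illustrative numbers, assumed here purely for demonstration rather than taken from the original barnacle-fiber fits:

```python
import numpy as np

# Illustrative Morris–Lecar parameters (assumed demonstration values,
# not the original barnacle-fiber fits of Morris & Lecar 1981).
C = 20.0                                 # membrane capacitance (uF/cm^2)
g_Ca, g_K, g_L = 4.4, 8.0, 2.0           # maximal conductances (mS/cm^2)
V_Ca, V_K, V_L = 120.0, -84.0, -60.0     # reversal potentials (mV)
V1, V2, V3, V4 = -1.2, 18.0, 2.0, 30.0   # gating parameters (mV)
phi = 0.04                               # rate scale of the recovery variable
I_app = 100.0                            # applied current (uA/cm^2)

def M_ss(V):   # steady-state calcium-channel activation
    return 0.5 * (1.0 + np.tanh((V - V1) / V2))

def N_ss(V):   # steady-state potassium-channel activation
    return 0.5 * (1.0 + np.tanh((V - V3) / V4))

def tau_N(V):  # voltage-dependent time constant of the recovery variable
    return 1.0 / (phi * np.cosh((V - V3) / (2.0 * V4)))

# Forward-Euler integration of the coupled (V, N) system.
dt, steps = 0.05, 20000                  # time step (ms) and number of steps
V, N = -60.0, N_ss(-60.0)                # start near rest
trace = np.empty(steps)
for k in range(steps):
    dV = (I_app
          - g_Ca * M_ss(V) * (V - V_Ca)
          - g_K * N * (V - V_K)
          - g_L * (V - V_L)) / C
    dN = (N_ss(V) - N) / tau_N(V)
    V, N = V + dt * dV, N + dt * dN
    trace[k] = V

print(f"membrane potential range: {trace.min():.1f} to {trace.max():.1f} mV")
```

With this level of applied current the simulated membrane potential spikes repeatedly; reducing I_app sufficiently lets the voltage settle to a fixed resting value instead, which is the distinction taken up in the next section.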
Excitability of neurons
One of the predominant themes in classical neurobiology is the concept of a digital component to neurons. This concept was quickly absorbed by computer science, where it evolved into the simple weighting function of coupled artificial neural networks. Neurobiologists call the critical voltage at which neurons fire a threshold. The dynamical criticism of this digital concept is that neurons do not truly exhibit all-or-none firing and should instead be thought of as resonators.[13]
In dynamical systems, this kind of property is known as excitability. An excitable system starts at some stable point. Imagine a dry lake bed at the top of a mountain with a ball resting in it. The ball sits at a stable point: gravity pulls it down, so it stays fixed at the bottom of the basin. If we give it a big enough push, it pops out of the basin and rolls down the side of the mountain, gaining momentum and speed. Suppose we fashioned a loop-de-loop around the base of the mountain so that the ball travels around it and returns to the basin (ignoring rolling friction and air resistance). Now we have a system that stays in its rest state (the ball in the basin) until a perturbation knocks it out (rolling down the hill) but eventually returns to its rest state (back in the basin). In this example, gravity is the driving force and the spatial dimensions x (horizontal) and y (vertical) are the variables. In the Morris–Lecar neuron, the fundamental force is electromagnetic, and [math]\displaystyle{ V }[/math] and [math]\displaystyle{ N }[/math] span the phase space, but the dynamical picture is essentially the same. The electromagnetic force acts along [math]\displaystyle{ V }[/math] just as gravity acts along [math]\displaystyle{ y }[/math]. The shape of the mountain and the loop-de-loop couple the y and x dimensions to each other. In the neuron, nature has already decided how [math]\displaystyle{ V }[/math] and [math]\displaystyle{ N }[/math] are coupled, but the relationship is much more complicated than in the gravitational example.
This property of excitability is what gives neurons the ability to transmit information to each other, so it is important to dynamical neuron networks. However, the Morris–Lecar model can also operate in another parameter regime, in which it exhibits oscillatory behavior, tracing a closed orbit (limit cycle) in phase space indefinitely. This behavior is comparable to that of pacemaker cells in the heart, which do not rely on excitability themselves but may excite neurons that do.
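A minimal numerical sketch of this excitable-versus-oscillatory distinction appears below. For brevity it uses the simpler two-variable FitzHugh–Nagumo model rather than Morris–Lecar; the parameter values are standard textbook choices and the perturbation sizes are arbitrary assumptions for illustration:

```python
import numpy as np

def simulate_fhn(I_ext, v0=-1.2, w0=-0.6, pulse=0.0, dt=0.01, steps=40000):
    """Integrate the FitzHugh–Nagumo model with forward Euler and return the
    voltage-like variable v over time.  a, b, eps are standard textbook
    values; I_ext is a constant applied current; 'pulse' perturbs the rest state."""
    a, b, eps = 0.7, 0.8, 0.08
    v, w = v0 + pulse, w0
    trace = np.empty(steps)
    for k in range(steps):
        dv = v - v**3 / 3.0 - w + I_ext
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        trace[k] = v
    return trace

# Excitable regime: no sustained input.  A small kick decays straight back to
# rest, while a large kick produces a single spike before returning to rest.
small = simulate_fhn(I_ext=0.0, pulse=0.2)
large = simulate_fhn(I_ext=0.0, pulse=1.0)

# Oscillatory ("pacemaker") regime: a constant drive destabilizes the rest
# state and the trajectory settles onto a limit cycle.
pacing = simulate_fhn(I_ext=0.5)

# Crude diagnostics: the peak voltage shows whether a spike occurred at all,
# and the late-time swing shows whether oscillation is sustained.
for name, tr in [("small kick", small), ("large kick", large), ("driven", pacing)]:
    tail = tr[len(tr) // 2:]
    print(f"{name}: peak v = {tr.max():.2f}, late-time swing = {tail.max() - tail.min():.2f}")
```

Only the driven case keeps oscillating at late times, while the large kick fires once and returns to rest, mirroring the distinction between excitable and pacemaker-like regimes described above.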
Global neurodynamics
The global dynamics of a network of neurons depend on at least the first three of the following four attributes:
- individual neuron dynamics (primarily, their thresholds or excitability)
- information transfer between neurons (generally via either synapses or gap junctions)
- network topology
- external forces (such as thermodynamic gradients)
Different choices among these four attributes yield a wide variety of network models and a correspondingly versatile array of global dynamics.
Biological neural network modeling
Biological neural networks can be modeled by choosing an appropriate biological neuron model to describe the physiology of the organism and appropriate coupling terms to describe the physical interactions between neurons (forming the network). Other global factors must also be taken into account, such as the initial conditions and the parameters of each neuron.
In terms of nonlinear dynamics, this requires evolving the state of the system forward in time through the coupled functions. Following the Morris–Lecar example, the equations become:
- [math]\displaystyle{ \begin{align} C {dV_i \over dt} & = -g_{Ca} M_{ss} (V_i-V_{Ca}) - g_K N_i (V_i-V_K) - g_L(V_i-V_L) + I_\text{app} + D_i(V_1, \ldots, V_n) \\[6pt] {dN_i \over dt} & = {{N_{ss} - N_i} \over {\tau_N}} \end{align} }[/math]
where [math]\displaystyle{ V }[/math] now carries the subscript [math]\displaystyle{ i }[/math], indicating that it belongs to the ith neuron in the network, and a coupling function [math]\displaystyle{ D_i }[/math], which depends on the voltages of the neurons connected to neuron i, has been added to the first equation. The coupling function is chosen based on the particular network being modeled; the two major candidates are synaptic junctions and gap junctions.
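As an illustration, the sketch below couples several Morris–Lecar neurons on a ring through a diffusive, gap-junction-like term, which is one simple choice for the coupling function. The ring topology, the coupling strength D_gap, and the parameter values are all assumptions made for demonstration:

```python
import numpy as np

# Illustrative parameters (assumed demonstration values, as above).
C, g_Ca, g_K, g_L = 20.0, 4.4, 8.0, 2.0
V_Ca, V_K, V_L = 120.0, -84.0, -60.0
V1, V2, V3, V4, phi = -1.2, 18.0, 2.0, 30.0, 0.04
I_app, D_gap = 100.0, 1.0            # applied current and gap-junction strength

M_ss  = lambda V: 0.5 * (1.0 + np.tanh((V - V1) / V2))
N_ss  = lambda V: 0.5 * (1.0 + np.tanh((V - V3) / V4))
tau_N = lambda V: 1.0 / (phi * np.cosh((V - V3) / (2.0 * V4)))

n = 10                                # neurons arranged in a ring
rng = np.random.default_rng(0)
V = -60.0 + 10.0 * rng.random(n)      # heterogeneous initial voltages
N = N_ss(V)

dt, steps = 0.05, 20000
for _ in range(steps):
    # Gap-junction (diffusive) coupling to nearest neighbours on the ring:
    # D_i = D_gap * (V_left + V_right - 2 * V_i)
    coupling = D_gap * (np.roll(V, 1) + np.roll(V, -1) - 2.0 * V)
    dV = (I_app
          - g_Ca * M_ss(V) * (V - V_Ca)
          - g_K * N * (V - V_K)
          - g_L * (V - V_L)
          + coupling) / C
    dN = (N_ss(V) - N) / tau_N(V)
    V, N = V + dt * dV, N + dt * dN

# Diffusive coupling tends to pull neighbouring voltages together; the
# instantaneous spread across the ring is a crude snapshot of synchrony.
print(f"final voltage spread across the ring: {V.max() - V.min():.2f} mV")
```

Replacing the diffusive term with a voltage-gated synaptic current would model chemical synapses instead; the structure of the loop stays the same, only the form of the coupling term changes.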
Attractor network
- Point attractors – memory, pattern completion, categorizing, noise reduction
- Line attractors – neural integration: oculomotor control
- Ring attractors – neural integration: spatial orientation
- Plane attractors – neural integration: (higher dimension of oculomotor control)
- Cyclic attractors – central pattern generators
- Chaotic attractors – possibly involved in the recognition of odors; chaotic dynamics are often mistaken for random noise.
See Scholarpedia's page for a formal review of attractor networks.[14]
Beyond neurons
While neurons play a lead role in brain dynamics, it is becoming increasingly clear to neuroscientists that the behavior of neurons is highly dependent on their environment. That environment is not a passive background: a great deal happens just outside the neuron membrane, in the extracellular space. Neurons share this space with glial cells, and the extracellular space itself may contain several agents that interact with the neurons.[15]
Glia
Glia, once considered a mere support system for neurons, have been found to serve a significant role in the brain.[16][17] How the interaction between neurons and glia influences neuronal excitability is a question of dynamics.[18]
Neurochemistry
Like any other cell, a neuron runs on an enormously complex set of molecular reactions. Each cell is a tiny community of molecular machinery (organelles) working in tandem and encased in a lipid membrane. These organelles communicate largely through chemical signals such as G proteins and neurotransmitters, consuming ATP for energy. Such chemical complexity is of interest to physiological studies of the neuron.
Neuromodulation
Neurons in the brain are bathed in extracellular fluid, which can propagate both chemical and physical energy through reaction–diffusion processes and bond manipulation that leads to thermal gradients. Volume transmission has been associated with thermal gradients caused by biological reactions in the brain.[19] Such complex transmission has been associated with migraines.[20]
Cognitive neuroscience
Computational approaches to theoretical neuroscience often employ artificial neural networks that simplify the dynamics of single neurons in favor of examining more global dynamics. While neural networks are often associated with artificial intelligence, they have also been productive in the cognitive sciences.[21] Artificial neural networks use simple neuron models, but their global dynamics are capable of exhibiting both Hopfield-like and attractor-like network dynamics.
Hopfield network
A Lyapunov function is a tool of nonlinear analysis used to establish the stability of an equilibrium of a system of differential equations. Hopfield networks were specifically designed so that their underlying dynamics are described by a Lyapunov function, an energy that never increases along the network's trajectories. Stability in biological systems is called homeostasis. Of particular interest to the cognitive sciences, Hopfield networks have been implicated in associative memory (memory triggered by cues).[22]
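A toy Python sketch of these ideas appears below: a small binary Hopfield network stores two patterns with a Hebbian rule, recalls one of them from a corrupted cue, and exposes the energy (Lyapunov) function that never increases under asynchronous updates. The patterns and network size are arbitrary assumptions chosen for illustration:

```python
import numpy as np

# Two stored patterns of 8 binary (+1/-1) units, chosen arbitrarily.
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
])
n = patterns.shape[1]

# Hebbian weight matrix with no self-connections; its symmetry is what
# guarantees that an energy (Lyapunov) function exists for the dynamics.
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0.0)

def energy(s):
    """Hopfield energy; it never increases under asynchronous updates."""
    return -0.5 * s @ W @ s

def recall(cue, sweeps=10):
    s = cue.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(n):   # asynchronous, random-order updates
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

cue = patterns[0].copy()
cue[:2] *= -1                                # corrupt two bits of pattern 0
out = recall(cue)
print("energy of cue:   ", energy(cue))
print("energy of recall:", energy(out))
print("recovered pattern 0:", np.array_equal(out, patterns[0]))
```

The recalled state has lower energy than the corrupted cue and matches the stored pattern, which is the associative-memory behavior (memory triggered by cues) described above.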
See also
- Computational neuroscience
- Mathematical biology
- Nonlinear systems
- Dynamical systems
- Randomness
- Neural oscillation
References
- ↑ Gerstner, Wulfram; Kistler, Werner M.; Naud, Richard; Paninski, Liam (2014-07-24) (in en). Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge University Press. ISBN 978-1-107-06083-8. https://books.google.com/books?id=D4j2AwAAQBAJ.
- ↑ Strogatz, Steven H. (2018-05-04) (in en). Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering. CRC Press. ISBN 978-0-429-97219-5. https://books.google.com/books?id=1kpnDwAAQBAJ.
- ↑ Izhikevich, E. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Massachusetts: The MIT Press, 2007.
- ↑ "Agenda of the Dynamical Neuroscience XVIII: The resting brain: not at rest!". http://neuro.dgimeetings.com/Agenda.aspx.
- ↑ Wackerbauer, Renate; Showalter, Kenneth (2003-10-22). "Collapse of Spatiotemporal Chaos". Physical Review Letters (American Physical Society (APS)) 91 (17): 174103. doi:10.1103/physrevlett.91.174103. ISSN 0031-9007. PMID 14611350. Bibcode: 2003PhRvL..91q4103W. https://researchrepository.wvu.edu/cgi/viewcontent.cgi?article=1085&context=faculty_publications.
- ↑ Lefèvre, Julien; Mangin, Jean-François (2010-04-22). Friston, Karl J.. ed. "A Reaction-Diffusion Model of Human Brain Development". PLOS Computational Biology (Public Library of Science (PLoS)) 6 (4): e1000749. doi:10.1371/journal.pcbi.1000749. ISSN 1553-7358. PMID 20421989. Bibcode: 2010PLSCB...6E0749L.
- ↑ Agnati, L.F.; Zoli, M.; Strömberg, I.; Fuxe, K. (1995). "Intercellular communication in the brain: Wiring versus volume transmission". Neuroscience (Elsevier BV) 69 (3): 711–726. doi:10.1016/0306-4522(95)00308-6. ISSN 0306-4522. PMID 8596642.
- ↑ Hodgkin, A. L.; Huxley, A. F. (1952-08-28). "A quantitative description of membrane current and its application to conduction and excitation in nerve" (in en). The Journal of Physiology 117 (4): 500–544. doi:10.1113/jphysiol.1952.sp004764. ISSN 0022-3751. PMID 12991237.
- ↑ Izhikevich E. and FitzHugh R. (2006), Scholarpedia, 1(9):1349
- ↑ Morris, C.; Lecar, H. (1981). "Voltage oscillations in the barnacle giant muscle fiber". Biophysical Journal (Elsevier BV) 35 (1): 193–213. doi:10.1016/s0006-3495(81)84782-0. ISSN 0006-3495. PMID 7260316. Bibcode: 1981BpJ....35..193M.
- ↑ Lecar, H. (2007), Scholarpedia, 2(10):1333
- ↑ Hodgkin, A. and Huxley, A. (1952): A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 117:500–544. PMID 12991237 [1]
- ↑ Izhikevich, E. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. Massachusetts: The MIT Press, 2007.
- ↑ Eliasmith, C. (2007), Scholarpedia, 2(10):1380
- ↑ Dahlem, Yuliya A.; Dahlem, Markus A.; Mair, Thomas; Braun, Katharina; Müller, Stefan C. (2003-09-01). "Extracellular potassium alters frequency and profile of retinal spreading depression waves". Experimental Brain Research (Springer Science and Business Media LLC) 152 (2): 221–228. doi:10.1007/s00221-003-1545-y. ISSN 0014-4819. PMID 12879176.
- ↑ Ullian, Erik M.; Christopherson, Karen S.; Barres, Ben A. (2004). "Role for glia in synaptogenesis". Glia (Wiley) 47 (3): 209–216. doi:10.1002/glia.20082. ISSN 0894-1491. PMID 15252809. http://www.phy.duke.edu/research/chempatt/ResearchDocs/pdfs/glia47_3_2005_2.pdf. Retrieved 2010-08-07.
- ↑ Keyser, David O.; Pellmar, Terry C. (1994). "Synaptic transmission in the hippocampus: Critical role for glial cells". Glia (Wiley) 10 (4): 237–243. doi:10.1002/glia.440100402. ISSN 0894-1491. PMID 7914511.
- ↑ Nadkarni, S. (2005) Dynamics of Dressed Neurons: Modeling the Neural-Glial Circuit and Exploring its Normal and Pathological Implications. Doctoral dissertation. Ohio University, Ohio. [2]
- ↑ Fuxe, K., Rivera, A., Jacobsen, K., Hoistad, M., Leo, G., Horvath, T., Staines, W., De la Calle, A. and Agnati, L. (2005) Dynamics of volume transmission in the brain. Focus on catecholamine and opioid peptide communication and the role of uncoupling protein 2. Journal of Neural Transmission, 112:1. [3]
- ↑ "Dahlem, M. (2009) Migraine and Chaos. SciLogs, 25 November". http://www.scilogs.eu/en/blog/gray-matters/2009-11-25/migraine_and_chaos.
- ↑ Gluck, M. 2001. Gateway to Memory: An Introduction to Neural Network Modeling of the Hippocampus and Learning. Massachusetts: MIT. [4]
- ↑ Hopfield, J. (2007), Scholarpedia, 2(5):1977
Original source: https://en.wikipedia.org/wiki/Dynamical_neuroscience