Biological neural network
In neuroscience, a biological neural network is a series of interconnected neurons whose activation defines a recognizable linear pathway. The interface through which neurons interact with their neighbors usually consists of several axon terminals connected via synapses to the dendrites of other neurons. If the sum of the input signals to a neuron surpasses a certain threshold, the neuron generates an action potential (AP) at the axon hillock and transmits this electrical signal along the axon.
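This summation-and-threshold behaviour can be sketched in a few lines of Python; the inputs, weights, and threshold below are hypothetical values chosen only to illustrate the idea, not measurements from a real neuron.

```python
# Minimal sketch of the summation-and-threshold idea described above.
# The inputs, weights, and threshold are hypothetical illustrative values.

def fires(inputs, weights, threshold):
    """Return True if the weighted sum of input signals crosses the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return total >= threshold

# Three presynaptic signals: two excitatory (positive weights), one inhibitory (negative).
inputs = [1, 1, 1]
weights = [0.6, 0.5, -0.3]
print(fires(inputs, weights, 0.7))  # True: 0.8 >= 0.7, i.e. an action potential is generated
```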
Biological neural networks have inspired the design of artificial neural networks.
Early study
Early treatments of neural networks can be found in Herbert Spencer's Principles of Psychology, 3rd edition (1872), Theodor Meynert's Psychiatry (1884), William James' Principles of Psychology (1890), and Sigmund Freud's Project for a Scientific Psychology (composed 1895).[1] The first rule of neuronal learning, now known as Hebbian learning, was described by Donald Hebb in 1949: Hebbian pairing of pre-synaptic and post-synaptic activity can substantially alter the dynamic characteristics of the synaptic connection and therefore facilitate or inhibit signal transmission. The neuroscientists Warren Sturgis McCulloch and Walter Pitts published the first works on the processing of neural networks.[2] They showed theoretically that networks of artificial neurons could implement logical, arithmetic, and symbolic functions. They set up simplified models of biological neurons, now usually called perceptrons or artificial neurons. These simple models accounted for neural summation (i.e., potentials at the post-synaptic membrane summate in the cell body). Later models also provided for excitatory and inhibitory synaptic transmission.
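As a rough illustration of Hebbian learning (not of McCulloch and Pitts' formalism), the following Python sketch repeatedly applies the rule that weights grow in proportion to the product of pre- and post-synaptic activity; the firing rates, initial weights, and learning rate are arbitrary assumptions.

```python
import numpy as np

# Toy illustration of Hebbian pairing: each weight grows in proportion to
# the product of pre- and post-synaptic activity ("cells that fire together
# wire together").  All numerical values are arbitrary assumptions.

rng = np.random.default_rng(0)
pre = rng.random(4)           # presynaptic firing rates (arbitrary)
w = 0.01 * rng.random(4)      # small initial synaptic weights
eta = 0.05                    # learning rate (assumed)

for _ in range(50):
    post = float(w @ pre)     # postsynaptic activity from neural summation
    w += eta * pre * post     # Hebb's rule: co-active pre/post strengthens the synapse

print(np.round(w, 3))         # weights onto the most active inputs have grown the most
```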
Connections between neurons
The connections between neurons are much more complex than those implemented in neural computing architectures. The basic kinds of connections between neurons are chemical synapses and electrical gap junctions. One principle by which neurons work is neural summation: potentials at the postsynaptic membrane sum up in the cell body. If the depolarization of the neuron at the axon hillock exceeds threshold, an action potential occurs that travels down the axon to the terminal endings to transmit a signal to other neurons. Excitatory and inhibitory synaptic transmission is realized mostly by excitatory postsynaptic potentials (EPSPs) and inhibitory postsynaptic potentials (IPSPs), respectively.
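A leaky integrate-and-fire model is one simple way to express this summation-and-threshold principle in code. The sketch below assumes arbitrary values for the membrane time constant, threshold, and synaptic amplitudes; it illustrates the principle rather than modelling any real cell.

```python
import numpy as np

# Leaky integrate-and-fire sketch of summation and thresholding.  The time
# constant, threshold, and synaptic amplitudes are illustrative assumptions.

dt = 1.0                            # time step (ms)
tau = 20.0                          # membrane time constant (ms)
v_rest, v_thresh = -70.0, -55.0     # resting and threshold potentials (mV)
v = v_rest
spike_times = []

rng = np.random.default_rng(1)
for t in range(200):
    epsp = 3.0 * rng.poisson(0.6)   # excitatory drive (depolarizing, mV)
    ipsp = -2.0 * rng.poisson(0.2)  # inhibitory drive (hyperpolarizing, mV)
    v += dt * (v_rest - v) / tau + epsp + ipsp   # leaky summation in the cell body
    if v >= v_thresh:               # depolarization above threshold...
        spike_times.append(t)       # ...produces an action potential
        v = v_rest                  # ...and the membrane potential resets

print(f"{len(spike_times)} action potentials at t = {spike_times} ms")
```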
On the electrophysiological level, there are various phenomena which alter the response characteristics of individual synapses (called synaptic plasticity) and individual neurons (intrinsic plasticity). These are often divided into short-term plasticity and long-term plasticity. Long-term synaptic plasticity is often contended to be the most likely memory substrate. Usually the term "neuroplasticity" refers to changes in the brain that are caused by activity or experience.
Connections display temporal and spatial characteristics. Temporal characteristics refer to the continuously modified, activity-dependent efficacy of synaptic transmission, called spike-timing-dependent synaptic plasticity. It has been observed in several studies that the synaptic efficacy of this transmission can undergo short-term increases (called facilitation) or decreases (depression) according to the activity of the presynaptic neuron. The induction of long-term changes in synaptic efficacy, by long-term potentiation (LTP) or depression (LTD), depends strongly on the relative timing of the onset of the excitatory postsynaptic potential and the postsynaptic action potential. LTP is induced by a series of action potentials which cause a variety of biochemical responses. Eventually, these reactions cause the expression of new receptors on the cellular membranes of the postsynaptic neurons or increase the efficacy of the existing receptors through phosphorylation.
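This timing dependence of LTP and LTD is often summarised by a pair-based spike-timing-dependent plasticity (STDP) rule. The sketch below uses generic, assumed amplitudes and time constants to show the sign convention: a presynaptic spike shortly before a postsynaptic spike strengthens the synapse, while the reverse order weakens it.

```python
import numpy as np

# Pair-based STDP rule with generic assumed parameters: pre-before-post
# potentiates (LTP-like), post-before-pre depresses (LTD-like).

A_plus, A_minus = 0.01, 0.012      # potentiation / depression amplitudes (assumed)
tau_plus, tau_minus = 20.0, 20.0   # decay time constants (ms, assumed)

def stdp_dw(dt_ms):
    """Weight change for one pre/post spike pair; dt_ms = t_post - t_pre."""
    if dt_ms > 0:                  # presynaptic spike before postsynaptic spike
        return A_plus * np.exp(-dt_ms / tau_plus)
    return -A_minus * np.exp(dt_ms / tau_minus)   # postsynaptic spike first

for dt in (-40, -10, 10, 40):
    print(f"t_post - t_pre = {dt:+} ms  ->  dw = {stdp_dw(dt):+.4f}")
```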
Backpropagating action potentials normally cannot occur along the axon because, after an action potential travels down a given segment, the inactivation (h) gates of the voltage-gated sodium (Na+) channels close; even a transient opening of the activation (m) gates then cannot produce a change in the intracellular [Na+], preventing the generation of an action potential back towards the cell body. In some cells, however, neural backpropagation does occur through the dendritic arbor and may have important effects on synaptic plasticity and computation.
At a neuromuscular junction, a single presynaptic signal is sufficient to stimulate contraction of the postsynaptic muscle cell. In the spinal cord, however, at least 75 afferent neurons are required to make a motor neuron fire. This picture is further complicated by variation in time constants between neurons, as some cells can integrate their EPSPs over a longer period of time than others.
Synaptic depression has been observed particularly widely at synapses in the developing brain, and it has been speculated that it changes to facilitation in adult brains.
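Short-term depression and its recovery are often described with resource-depletion models (loosely in the spirit of Tsodyks-Markram-style models); the sketch below uses an assumed release fraction and recovery time constant purely to show how closely spaced spikes produce progressively smaller responses.

```python
# Resource-depletion sketch of short-term synaptic depression and recovery.
# The release fraction and recovery time constant are assumed values.

dt = 1.0             # time step (ms)
tau_rec = 200.0      # recovery time constant (ms, assumed)
U = 0.4              # fraction of resources released per spike (assumed)
resources = 1.0      # available synaptic resources (fraction of maximum)

spike_times = [10, 20, 30, 40, 300]   # hypothetical presynaptic spikes (ms)
amplitudes = []

for t in range(400):
    resources += dt * (1.0 - resources) / tau_rec   # slow recovery toward 1
    if t in spike_times:
        amplitudes.append(round(U * resources, 3))  # size of postsynaptic response
        resources -= U * resources                  # each spike depletes resources

print(amplitudes)  # responses shrink during the burst (depression) and recover after the pause
```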
Study methods
Different neuroimaging techniques have been developed to investigate the activity of neural networks. The use of "brain scanners", or functional neuroimaging, to investigate the structure or function of the brain is common, either simply as a way of better assessing brain injury with high-resolution pictures, or by examining the relative activations of different brain areas. Such technologies include fMRI (functional magnetic resonance imaging), PET (positron emission tomography) and CAT (computed axial tomography). Functional neuroimaging uses specific brain imaging technologies to take scans of the brain, usually while a person is performing a particular task, in an attempt to understand how the activation of particular brain areas relates to the task. Functional neuroimaging relies especially on fMRI, which measures hemodynamic activity that is closely linked to neural activity, as well as on PET and electroencephalography (EEG).
Connectionist models serve as a test platform for different hypotheses of representation, information processing, and signal transmission. Lesioning studies in such models, e.g. artificial neural networks in which some of the nodes are deliberately destroyed to see how the network performs, can also yield important insights into the working of cell assemblies. Similarly, simulations of dysfunctional neurotransmitters in neurological conditions (e.g., dopamine in the basal ganglia of Parkinson's patients) can yield insights into the underlying mechanisms for the patterns of cognitive deficits observed in the particular patient group. Predictions from these models can be tested in patients or via pharmacological manipulations, and these studies can in turn be used to inform the models, making the process iterative.
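A lesioning study of this kind can be illustrated with a toy artificial neural network. The sketch below hand-builds a tiny two-hidden-unit network that computes XOR (a task assumed purely for illustration), then "destroys" hidden units one at a time and reports how accuracy degrades.

```python
import numpy as np

# Toy "lesion study": a hand-built network with two hidden units computing
# XOR.  Hidden units are knocked out one at a time and accuracy is measured.

step = lambda z: (z > 0).astype(float)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)      # XOR targets

W_hidden = np.ones((2, 2))                   # both hidden units sum the two inputs
b_hidden = np.array([-0.5, -1.5])            # unit 0 acts like OR, unit 1 like AND
w_out = np.array([1.0, -2.0])                # output fires for OR but not AND -> XOR
b_out = -0.5

def forward(x, lesion_mask):
    """Run the network with some hidden units 'destroyed' (forced to output 0)."""
    h = step(x @ W_hidden + b_hidden) * lesion_mask
    return step(h @ w_out + b_out)

for mask, label in [(np.array([1.0, 1.0]), "intact network"),
                    (np.array([0.0, 1.0]), "lesion hidden unit 0 (OR)"),
                    (np.array([1.0, 0.0]), "lesion hidden unit 1 (AND)")]:
    accuracy = float(np.mean(forward(X, mask) == y))
    print(f"{label:28s} accuracy = {accuracy:.2f}")
```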
See also
- Artificial neural network
- Biological neuron models
- Central pattern generator
- Connectionism
- Feedback
- List of regions in the human brain
- Neural backpropagation
- Neural computing
- Neural oscillation
- Pulse-coupled networks
- Synaptic plasticity
References
- ↑ Michael S. C. Thomas; James L. McClelland. "Connectionist models of cognition" (PDF). Stanford University.
- ↑ J. Y. Lettvin; H. R. Maturana; W. S. McCulloch; W. H. Pitts (1959), "What the frog's eye tells the frog's brain", Proc. Inst. Radio Engr. 47: 1940–1951
Further reading
- Robert H. Cudmore; Niraj S. Desai. "Intrinsic plasticity". Scholarpedia 3(2): 1363. doi:10.4249/scholarpedia.1363
External links
- Comparison of Neural Networks in the Brain and Artificial Neural Networks
- Lecture notes at MIT OpenCourseWare
- Computation in the Brain
- Biological Neural Network Toolbox - A free Matlab toolbox for simulating networks of several different types of neurons
- WormWeb.org: Interactive Visualization of the C. elegans Neural Network - C. elegans, a nematode with 302 neurons, is the only organism for which the entire neural network has been uncovered. Use this site to browse through the network and to search for paths between any two neurons.
- Introduction to Neurons and Neuronal Networks, Neuroscience Online (electronic neuroscience textbook)
- Delaying Pulse Networks (Wave Interference Networks)