Brain-reading

Brain-reading uses the responses of multiple voxels in the brain, evoked by a stimulus and detected by fMRI, to decode the original stimulus. Brain-reading studies differ in the type of decoding employed (i.e. classification, identification, or reconstruction), the target (e.g. visual patterns, auditory patterns, or cognitive states), and the decoding algorithm used (linear classification, nonlinear classification, direct reconstruction, Bayesian reconstruction, etc.).

Classification

In classification, a pattern of activity across multiple voxels is used to determine the particular class from which the stimulus was drawn.[1] Many studies have classified visual stimuli, but this approach has also been used to classify cognitive states.
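
The sketch below illustrates the general idea in Python, using simulated voxel responses and an off-the-shelf linear classifier; it is not taken from any of the cited studies, and all data and parameters are invented for illustration.

    # Minimal multi-voxel pattern classification sketch on simulated data:
    # a linear classifier predicts the stimulus class from a vector of
    # voxel responses.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 200, 500

    # Two stimulus classes with slightly different mean activation patterns,
    # plus trial-by-trial measurement noise.
    labels = rng.integers(0, 2, size=n_trials)          # 0 = class A, 1 = class B
    class_patterns = rng.normal(0.0, 0.5, size=(2, n_voxels))
    voxels = class_patterns[labels] + rng.normal(0.0, 1.0, size=(n_trials, n_voxels))

    # Cross-validated linear classification, as is typical in fMRI decoding.
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, voxels, labels, cv=5)
    print(f"mean decoding accuracy: {scores.mean():.2f}")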

Reconstruction

In reconstruction, the aim is to create a literal picture of the image that was presented. Early studies used voxels from early visual cortex areas (V1, V2, and V3) to reconstruct geometric stimuli made up of flickering checkerboard patterns.[2][3]
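
A rough sense of direct reconstruction can be given by the following Python sketch, in which a linear decoder maps voxel responses back to the contrast of each image patch and the patch estimates are assembled into an image. The simulated data, the linear encoding, and the ridge-regression decoder are simplifying assumptions for illustration, not the method of the cited studies.

    # Direct-reconstruction sketch on simulated data: learn a mapping from
    # voxel responses back to pixel (patch) contrasts, then apply it to a
    # held-out response.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(1)
    n_trials, n_voxels, n_pixels = 400, 300, 10 * 10   # 10x10 binary patches

    # Simulated training stimuli and voxel responses generated by an unknown
    # linear encoding plus noise.
    stimuli = rng.integers(0, 2, size=(n_trials, n_pixels)).astype(float)
    encoding = rng.normal(0.0, 1.0, size=(n_pixels, n_voxels))
    responses = stimuli @ encoding + rng.normal(0.0, 0.5, size=(n_trials, n_voxels))

    # One ridge-regression decoder per pixel (fit jointly as multi-output).
    decoder = Ridge(alpha=10.0).fit(responses, stimuli)

    # Reconstruct a held-out stimulus from its simulated voxel response.
    test_stim = rng.integers(0, 2, size=(1, n_pixels)).astype(float)
    test_resp = test_stim @ encoding + rng.normal(0.0, 0.5, size=(1, n_voxels))
    reconstruction = decoder.predict(test_resp).reshape(10, 10)
    print(np.round(reconstruction, 1))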

Natural images

More recent studies used voxels from early visual cortex together with voxels from anterior visual areas forward of it (V3A, V3B, V4, and the lateral occipital complex), combined with Bayesian inference techniques, to reconstruct complex natural images. This brain-reading approach uses three components:[4] a structural encoding model that characterizes responses in early visual areas; a semantic encoding model that characterizes responses in anterior visual areas; and a Bayesian prior that describes the distribution of structural and semantic scene statistics.[4]
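
A minimal Python sketch of the Bayesian idea follows: the prior is approximated by a finite sample of candidate images, a (here purely hypothetical, linear) encoding model predicts the voxel response to each candidate, and the reconstruction is the candidate whose predicted response best explains the measured response. The simulated data, the single linear encoding model, and the Gaussian noise assumption are simplifications standing in for the structural and semantic models and the natural-image prior described above.

    # Bayesian reconstruction sketch: maximum a posteriori selection from a
    # sampled image prior, assuming a flat prior over the sample and
    # independent Gaussian voxel noise.
    import numpy as np

    rng = np.random.default_rng(2)
    n_voxels, n_pixels, n_prior = 200, 64, 5000

    # Stand-in "natural image prior": a sample of candidate images.
    prior_images = rng.normal(0.0, 1.0, size=(n_prior, n_pixels))

    # Hypothetical linear encoding model mapping images to voxel responses.
    encoding = rng.normal(0.0, 1.0, size=(n_pixels, n_voxels)) / np.sqrt(n_pixels)
    noise_sd = 0.5

    # Measured response to an unknown target image (here drawn from the prior).
    target = prior_images[1234]
    measured = target @ encoding + rng.normal(0.0, noise_sd, size=n_voxels)

    # Log-likelihood of the measured response under each candidate image.
    predicted = prior_images @ encoding
    log_lik = -0.5 * np.sum((measured - predicted) ** 2, axis=1) / noise_sd ** 2

    best = int(np.argmax(log_lik))
    print("selected prior image:", best, "(target was 1234)")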

Experimentally, subjects first view 1,750 black-and-white natural images, and the evoked voxel activation is used to fit the encoding models. They then view another 120 novel target images, and the models fitted to the earlier scans are used to reconstruct them. The natural images used include pictures of a seaside cafe and harbor, performers on a stage, and dense foliage.[4]

Other types

It is possible to track from fMRI signals which of two forms of a rivalrous binocular illusion a person is subjectively experiencing.[5] The category of event that a person freely recalls can be identified from fMRI before they say what they remembered.[6] Statistical analysis of EEG brainwaves has been claimed to allow the recognition of phonemes,[7] and, at a 60% to 75% accuracy level, of color and visual shape words.[8] It has also been shown that brain-reading can be achieved in a complex virtual environment.[9]

Accuracy

Brain-reading accuracy is increasing steadily as the quality of the data and the sophistication of the decoding algorithms improve. In one recent experiment it was possible to identify which single image was being viewed from a set of 120.[10] In another, it was possible to identify which of two categories the stimulus came from 90% of the time, and the specific semantic category (out of 23) of the target image 40% of the time.[4]
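
The identification setting can be sketched in Python as follows: an encoding model predicts the response to each candidate image, and the candidate whose prediction correlates best with the measured response is chosen. The linear encoding model and the simulated responses are assumptions made for illustration only, not the cited experiment's actual models.

    # Identification sketch: pick which of 120 candidate images evoked a
    # measured response by matching it against model-predicted responses.
    import numpy as np

    rng = np.random.default_rng(3)
    n_voxels, n_pixels, n_candidates = 150, 64, 120

    candidates = rng.normal(0.0, 1.0, size=(n_candidates, n_pixels))
    encoding = rng.normal(0.0, 1.0, size=(n_pixels, n_voxels)) / np.sqrt(n_pixels)

    true_index = 42
    measured = candidates[true_index] @ encoding + rng.normal(0.0, 0.8, size=n_voxels)

    # Correlation between the measured response and each candidate's prediction.
    predicted = candidates @ encoding
    corr = np.array([np.corrcoef(measured, p)[0, 1] for p in predicted])
    print("identified image:", int(np.argmax(corr)), "of", n_candidates)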

Limitations

It has been noted that so far brain reading is limited. "In practice, exact reconstructions are impossible to achieve by any reconstruction algorithm on the basis of brain activity signals acquired by fMRI. This is because all reconstructions will inevitably be limited by inaccuracies in the encoding models and noise in the measured signals. Our results demonstrate that the natural image prior is a powerful (if unconventional) tool for mitigating the effects of these fundamental limitations. A natural image prior with only six million images is sufficient to produce reconstructions that are structurally and semantically similar to a target image."[4]

References

  1. Kamitani, Yukiyasu; Tong, Frank (2005). "Decoding the visual and subjective contents of the human brain". Nature Neuroscience 8 (5): 679–85. doi:10.1038/nn1444. PMC 1808230. PMID 15852014.
  2. Miyawaki, Y; Uchida, H; Yamashita, O; Sato, M; Morito, Y; Tanabe, H; Sadato, N; Kamitani, Y (2008). "Visual Image Reconstruction from Human Brain Activity using a Combination of Multiscale Local Image Decoders". Neuron 60 (5): 915–29. doi:10.1016/j.neuron.2008.11.004. PMID 19081384.
  3. Thirion, Bertrand; Duchesnay, Edouard; Hubbard, Edward; Dubois, Jessica; Poline, Jean-Baptiste; Lebihan, Denis; Dehaene, Stanislas (2006). "Inverse retinotopy: Inferring the visual content of images from brain activation patterns". NeuroImage 33 (4): 1104–16. doi:10.1016/j.neuroimage.2006.06.062. PMID 17029988.
  4. Naselaris, Thomas; Prenger, Ryan J.; Kay, Kendrick N.; Oliver, Michael; Gallant, Jack L. (2009). "Bayesian Reconstruction of Natural Images from Human Brain Activity". Neuron 63 (6): 902–15. doi:10.1016/j.neuron.2009.09.006. PMID 19778517.
  5. Haynes, J; Rees, G (2005). "Predicting the Stream of Consciousness from Activity in Human Visual Cortex". Current Biology 15 (14): 1301–7. doi:10.1016/j.cub.2005.06.026. PMID 16051174.
  6. Polyn, S. M.; Natu, VS; Cohen, JD; Norman, KA (2005). "Category-Specific Cortical Activity Precedes Retrieval During Memory Search". Science 310 (5756): 1963–6. doi:10.1126/science.1117645. PMID 16373577.
  7. Suppes, Patrick; Perreau-Guimaraes, Marcos; Wong, Dik Kin (2009). "Partial Orders of Similarity Differences Invariant Between EEG-Recorded Brain and Perceptual Representations of Language". Neural Computation 21 (11): 3228–69. doi:10.1162/neco.2009.04-08-764. PMID 19686069.
  8. Suppes, Patrick; Han, Bing; Epelboim, Julie; Lu, Zhong-Lin (1999). "Invariance of brain-wave representations of simple visual images and their names". Proceedings of the National Academy of Sciences of the United States of America 96 (25): 14658–63. doi:10.1073/pnas.96.25.14658. PMC 24492. PMID 10588761.
  9. Chu, Carlton; Ni, Yizhao; Tan, Geoffrey; Saunders, Craig J.; Ashburner, John (2010). "Kernel regression for fMRI pattern prediction". NeuroImage 56 (2): 662–673. doi:10.1016/j.neuroimage.2010.03.058. PMC 3084459. PMID 20348000.
  10. Kay, Kendrick N.; Naselaris, Thomas; Prenger, Ryan J.; Gallant, Jack L. (2008). "Identifying natural images from human brain activity". Nature 452 (7185): 352–5. doi:10.1038/nature06713. PMC 3556484. PMID 18322462.
