==History==
In 1967, Edmond M. Dewan published a paper in Nature demonstrating control of alpha waves, turning them on and off to produce Morse code. (ref 5) Using an EEG machine, Dewan and his fellow researchers were able to send words and phrases by thought alone.
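The signalling scheme Dewan used can be illustrated with a toy decoder. This sketch assumes the alpha bursts have already been detected and classified by duration into dots and dashes; the token format and the partial Morse table are illustrative, not from Dewan's paper.

```python
# Toy Morse decoder for on/off alpha-wave signalling.
# Input: one Morse token per letter, e.g. ["...", "---", "..."].
# Only a subset of the Morse alphabet is included for illustration.
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", ".": "E", "....": "H",
    "..": "I", "---": "O", "...": "S", "-": "T",
}

def decode(bursts):
    """Map each burst pattern (dots/dashes) to its letter; '?' if unknown."""
    return "".join(MORSE.get(token, "?") for token in bursts)

# Three short bursts, three long bursts, three short bursts -> "SOS"
print(decode(["...", "---", "..."]))
```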
In 1976, Robert G. Malech was awarded United States Patent 3951134 for remotely monitoring and altering brainwaves using radio. (ref 6) This patent makes reference to demodulating the waveform, displaying it to an operator for viewing, and passing it to a computer for further analysis.
In 1988, Farwell, L.A. & Donchin, E. produced a paper describing a method of transmitting linguistic information using the P300 response. (ref 7) The system worked by matching the observed P300 responses to the item the subject was attending to, in this case allowing the subject to select a letter of the alphabet they were thinking of. In theory, any input could be used and a lexicon constructed.
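The selection step in a P300 matrix speller of this kind can be sketched as follows. This is a minimal illustration, assuming the per-row and per-column P300 response amplitudes have already been averaged from the EEG; the 6x6 character grid and score format are assumptions, not taken from the original paper.

```python
import numpy as np

# Hypothetical 6x6 speller grid (Farwell & Donchin used a grid of
# letters and symbols whose rows and columns flash in sequence).
MATRIX = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
          list("STUVWX"), list("YZ1234"), list("56789_")]

def select_letter(row_scores, col_scores):
    """Pick the cell whose row and column evoked the largest mean
    P300-like amplitude across repeated flashes."""
    r = int(np.argmax(row_scores))
    c = int(np.argmax(col_scores))
    return MATRIX[r][c]

# Row 2 and column 3 elicit the strongest responses -> letter "P".
print(select_letter([0.1, 0.2, 0.9, 0.1, 0.1, 0.1],
                    [0.0, 0.1, 0.2, 0.8, 0.1, 0.0]))
```

In practice the amplitudes are averaged over many flash repetitions to overcome the low signal-to-noise ratio of single-trial EEG.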
United States Patent 6,011,991, granted January 4, 2000, describes a method of monitoring an individual’s brain waves remotely, for the purposes of communication. Filed December 7, 1998, the patent outlines a system that monitors an individual’s brainwaves via a sensor, then transmits this, specifically by satellite, to a computer for analysis. This analysis would determine if the individual was attempting to communicate a “word, phrase, or thought corresponding to the matched stored […]”.
Today, the driving force appears to be silent communication with battlefield troops. $4 million was provided in 2009/2010 to develop such a system, called “Silent Talk”. (ref 13) Much of the research is being conducted at The Cognitive NeuroSystems Lab at UC Irvine. (ref 14)
A further $4 million was allocated to the University of California to investigate computer-mediated “synthetic telepathy”. (ref 15) The research aims to detect and analyze, using EEG, the word-specific neural signals that occur before speech is vocalized, and to see if the patterns are generalizable. (ref 16)
The research is part of a wider $70 million project, begun in 2000, which aims to develop hardware capable of adapting to the behavior of its user. (ref 17)
Quite apart from linguistic information, images have been extracted from the brain. Researchers at Japan’s ATR Computational Neuroscience Laboratories have been able to reconstruct images that a subject is currently viewing. The ultimate goal of the unclassified project is to view both retinal and imagined images in real time, including dreams. (ref 18)