This is one of the most fascinating studies I’ve heard about in quite a while. According to an article in Neuroscience News, neuroscientists at the University of Toronto used EEG recordings to reconstruct the images participants were looking at from their brain activity. Study participants were shown pictures of faces while researchers recorded their EEG signals. Using the recorded signals and a technique based on machine learning algorithms, the researchers were then able to recreate the face images. This kind of brain reading has been done before using fMRI measurements, but those reconstructions were grainy; in this experiment, the researchers were able to recover much more detail. As the technique matures, it could be used to help those who cannot communicate verbally, or for law enforcement purposes. A summary is available at Neuroscience News, and the study itself will appear in an upcoming edition of the journal eNeuro. Read more about it here: http://neurosciencenews.com/ai-eeg-images-8546/
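For readers curious about what “a technique based on machine learning” can look like in practice, here is a minimal sketch of one common approach: learn a mapping from EEG features to a compressed image representation, then invert that representation back into pixels. The synthetic data, array sizes, and the choice of PCA plus ridge regression below are illustrative assumptions on my part, not the authors’ actual pipeline; consult the eNeuro paper for their method.

```python
# Illustrative sketch only: decode images from EEG-like features by
# (1) compressing images with PCA, (2) learning a linear map from EEG
# features to those PCA components, (3) inverting PCA to get pixels back.
# All data here are synthetic stand-ins.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials = 200          # hypothetical number of face-viewing trials
n_eeg_features = 512    # e.g., channels x time points, flattened
img_size = 32           # hypothetical 32x32 grayscale face images

# Fake "face" images and EEG responses that carry a noisy trace of them
images = rng.random((n_trials, img_size * img_size))
mixing = rng.normal(size=(img_size * img_size, n_eeg_features))
eeg = images @ mixing + rng.normal(scale=0.5, size=(n_trials, n_eeg_features))

# Compress images to a small number of components
pca = PCA(n_components=20)
img_codes = pca.fit_transform(images)

# Learn a simple linear decoder from EEG features to image components
X_train, X_test, y_train, y_test = train_test_split(
    eeg, img_codes, test_size=0.25, random_state=0)
decoder = Ridge(alpha=1.0)
decoder.fit(X_train, y_train)

# Reconstruct images from held-out EEG by inverting the PCA step
predicted_codes = decoder.predict(X_test)
reconstructed = pca.inverse_transform(predicted_codes)
print("Reconstructed image batch shape:", reconstructed.shape)
```

The design choice worth noticing is that nothing here “reads minds”: the model only learns a statistical mapping between recorded brain responses and the specific stimuli used during training.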
Although this research is impressive, be aware that neurofeedback cannot read what is happening in a person’s brain as it unfolds. Such an ability is decades off, at least. Instead, neurofeedback measures a trainee’s EEG brainwaves and reflects them back in such a way that the brain can teach itself new patterns and ways of being.