Researchers who can read minds
“What are you thinking about?” Reading minds the way science-fiction characters do is fascinating, but it is just make-believe. In everyday life, when we are facing somebody, we cannot know what, or whom, that person is thinking about without asking. But what if researchers told you the opposite was true?
Alan S. Cowen (University of California, Berkeley), Marvin M. Chun (Yale University) and Brice A. Kuhl (New York University) recently carried out a study on reconstructing images, and therefore thoughts, from brain activity. This is a real first in the field!
The volunteers in this project were shown 300 faces while their brain activity was recorded by functional MRI (fMRI). The team of scientists then showed them 30 new faces and reconstructed those 30 faces using only the brain activity they evoked, interpreted through the model built from the first 300 faces.
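For readers curious about the mechanics, the sketch below illustrates the general idea in Python: learn a mapping from brain-activity patterns to a compact description of face images using the 300 training faces, then use that mapping to turn the activity evoked by new faces back into pictures. The random stand-in data, the dimensions, and the choice of PCA plus ridge regression are illustrative assumptions here, not the authors' actual pipeline.

```python
# Minimal sketch of image reconstruction from brain activity.
# All data below is random stand-in data; names and model choices are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Stand-ins: 300 training faces (flattened pixel vectors) and the fMRI
# activity pattern recorded while each face was viewed.
n_train, n_test, n_pixels, n_voxels = 300, 30, 64 * 64, 500
train_faces = rng.random((n_train, n_pixels))
train_activity = rng.random((n_train, n_voxels))
test_activity = rng.random((n_test, n_voxels))   # activity evoked by 30 new faces

# 1. Compress the face images into a small number of components ("eigenfaces").
face_space = PCA(n_components=50).fit(train_faces)
train_components = face_space.transform(train_faces)

# 2. Learn to predict those components from brain activity.
decoder = Ridge(alpha=1.0).fit(train_activity, train_components)

# 3. For the new faces, predict components from activity alone and map them
#    back to pixels: the blurred "reconstructions" described in the article.
predicted_components = decoder.predict(test_activity)
reconstructions = face_space.inverse_transform(predicted_components)
print(reconstructions.shape)  # (30, 4096): one reconstructed image per new face
```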
The images obtained appear in the publication on page 15. Although the reconstructions are blurred, they do look quite similar to the actual faces. What’s more, all of the reconstructions have the correct skin color, and on average 24 out of 30 correctly captured whether the person was smiling.
Reading thoughts proved trickier, however, when it came to determining the person’s hair color or gender: two-thirds of the reconstructions got the gender right, and only half showed the correct hair color.
Given these results, Alan S. Cowen asserts that what they have done “is mind reading”. He adds that the results could well be improved with more sophisticated mathematical models.
Once the technique has been refined, a range of applications opens up: recording dreams, solving crimes, or even better understanding mental disorders. “You can see how people perceive faces depending on different disorders, like autism – and use that to help diagnose therapies,” he says.
Source: Cowen, A. S., Chun, M. M., & Kuhl, B. A. Neural portraits of perception: Reconstructing face images from evoked brain activity. NeuroImage 2014, 94:12-22.