AI That Interprets Brain Data Produces Personally Attractive Images


Researchers have succeeded in creating an AI that can read our subjective notions of what makes a face attractive. The system demonstrated this knowledge by generating new images that individual people found attractive. The findings can be applied, for example, in modeling preferences, supporting decision-making, and identifying unconscious attitudes.

Researchers at the University of Copenhagen and the University of Helsinki investigated whether a computer could identify the facial features we consider attractive and, based on this, create new images matching our individual criteria. The researchers used AI to interpret brain signals and combined the resulting brain-computer interface with a generative model of artificial faces. This enabled the computer to create facial images that appealed to individual preferences.

“Previously, we developed models that could identify and control image features, such as hair color. However, people largely agree on who is blond and who smiles. Attractiveness is a more challenging subject of study, as it is associated with social and emotional factors that likely play unconscious roles in our individual preferences. Indeed, we find it difficult to explain what it is that makes something, or someone, beautiful: beauty is in the eye of the beholder,” says Docent Michiel Spapé of the Department of Psychology and Logopedics, University of Helsinki.

Image: A computer created facial images that appealed to individual preferences. Credit: Cognitive Computing Research Group

Preferences Revealed by the Brain

Initially, the researchers gave a generative adversarial neural network (GAN) the task of creating hundreds of artificial portraits. The images were shown, one at a time, to 30 volunteers, who were asked to pay attention to the faces they found attractive while their brain responses were recorded via electroencephalography (EEG).
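The article does not detail the exact generator or stimulus pipeline used. The following Python sketch only illustrates the general idea of sampling artificial faces from a pretrained GAN and keeping the latent vector behind each stimulus; the checkpoint file, file names, and dimensions are hypothetical assumptions, not the study's code.

```python
# Illustrative sketch only -- not the study's actual code.
# Assumes a generic pretrained GAN generator mapping latent vectors to faces;
# the checkpoint file and latent dimensionality are hypothetical.
import numpy as np
import torch

LATENT_DIM = 512   # size of the generator's latent space (assumed)
N_STIMULI = 240    # number of artificial faces shown to each volunteer (assumed)

generator = torch.load("pretrained_face_generator.pt")  # hypothetical checkpoint
generator.eval()

# Sample random latent vectors and decode them into face images.
latents = torch.randn(N_STIMULI, LATENT_DIM)
with torch.no_grad():
    faces = generator(latents)  # tensor of shape (N_STIMULI, 3, H, W)

# Store the latent vector of every stimulus: later steps work in latent space,
# pairing each EEG response with the latent that produced the face on screen.
np.save("stimulus_latents.npy", latents.numpy())
```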

“It worked a bit like the dating app Tinder: the participants ‘swiped right’ when they came across an attractive face. Here, however, the volunteers did not have to do anything but look at the images; there was no swiping left or right. We measured their immediate brain response to the images,” Spapé explains.

The researchers analyzed the EEG data with machine learning techniques, connecting individual EEG responses through a brain-computer interface to the generative neural network.
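The article does not specify the classification method. A common approach in such brain-computer interfaces is a regularized linear classifier over single-trial EEG epochs; the sketch below is a minimal Python illustration of that general technique, using randomly generated placeholder data instead of real recordings.

```python
# Minimal sketch of single-trial EEG classification -- not the study's code.
# Placeholder data stands in for preprocessed EEG epochs and per-image labels.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 240, 32, 100                    # assumed layout
epochs = rng.standard_normal((n_trials, n_channels, n_samples))   # placeholder EEG
labels = rng.integers(0, 2, size=n_trials)                        # 1 = rated attractive

# Flatten each epoch into a feature vector; real pipelines typically average
# short post-stimulus time windows rather than using every raw sample.
features = epochs.reshape(n_trials, -1)

# Shrinkage LDA is a standard, data-efficient classifier for event-related EEG.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```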

“A brain-computer interface such as this can interpret users’ views on the attractiveness of a series of images. By interpreting these views, the AI model reading the brain responses and the generative neural network modeling the face images can together compose an entirely new face image that combines the features a particular person finds attractive,” says Academy Research Fellow and Associate Professor Tuukka Ruotsalo, who heads the project.
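How the two models are combined is not spelled out in the article. One simple way to picture it, assuming the per-image decisions of the EEG classifier and the stimulus latents from the earlier sketches, is to aggregate the latent vectors of the faces classified as attractive and decode the result into a new face. The function and file names below are hypothetical.

```python
# Illustrative sketch, not the authors' method: aggregate the GAN latents of
# faces the EEG classifier marked as attractive, then decode a new face.
import numpy as np

def preferred_latent(latents: np.ndarray, attractive_mask: np.ndarray) -> np.ndarray:
    """Average the latent vectors of the images judged attractive for one person."""
    return latents[attractive_mask].mean(axis=0)

# latents: (n_stimuli, latent_dim) vectors used to generate the shown faces;
# attractive_mask: boolean per-stimulus output of the EEG classifier.
latents = np.load("stimulus_latents.npy")               # from the generation sketch
attractive_mask = np.load("predicted_attractive.npy")   # hypothetical classifier output

z_personal = preferred_latent(latents, attractive_mask)
# Feeding z_personal back through the same generator yields a new, unseen face
# intended to match this particular participant's preferences.
```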

To test the validity of their modeling, the researchers generated new portraits for each volunteer, predicting that they would find them personally attractive. Testing the portraits in a double-blind procedure against matched controls, they found that the new images matched the participants’ preferences with an accuracy of over 80%.
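The exact evaluation protocol is not described in the article. As a rough illustration of what such an accuracy figure can mean, one could count how often a generated face is preferred over its matched control when a participant rates both, as in the following sketch with placeholder ratings.

```python
# Rough illustration of the accuracy measure under an assumed evaluation setup:
# "accuracy" is the fraction of pairs in which the generated face beats its
# matched control in the participant's own ratings. Ratings are placeholders.
import numpy as np

rng = np.random.default_rng(1)
ratings_generated = rng.uniform(0, 10, size=40)  # placeholder ratings of new faces
ratings_control = rng.uniform(0, 10, size=40)    # placeholder ratings of controls

accuracy = np.mean(ratings_generated > ratings_control)
print(f"fraction of pairs where the generated face was preferred: {accuracy:.2f}")
```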

“The study demonstrates that we can generate images that match personal preference by connecting an artificial neural network to brain responses. Succeeding in assessing attractiveness is especially significant, as it is such a poignant, emotional property of the stimuli. Computer vision has so far been quite successful at classifying images based on objective patterns. By bringing brain responses into the mix, we show that it is possible to detect and generate images based on subjective properties, such as personal taste,” Spapé explains.

Ultimately, the study may benefit society by advancing the capacity of computers to learn and increasingly understand subjective preferences, through the interaction between AI solutions and brain-computer interfaces.

Video: AI That Interprets Brain Data Produces Personally Attractive Images

