New AI evaluates the “normality” of human faces

A website has appeared on which artificial intelligence (AI) estimates how “normal” a user is, based on basic data extracted from a photograph. Similar algorithms are used by Tinder and TikTok.

The researchers explained that facial recognition technology is increasingly used in people’s daily lives, but it often makes mistakes, is biased, and can be used to manipulate data. The new site, called “How Normal Am I?”, highlights the risks of AI-assisted assessment by running algorithms that estimate age, attractiveness, body mass index, life expectancy, and gender.

The site promises not to collect personal data or use cookies. The portal was created by researcher and artist Tijmen Schep, who wanted to explore how artificial intelligence affects ethics and human rights. Each user can upload a photo to the site so that the AI can rate their “normality” on a scale from 0 to 10.

The site’s creator noted that dating apps like Tinder use similar algorithms to match people the software judges equally attractive, while social media platforms like TikTok use them to promote content from users the algorithm rates as more attractive.

The algorithm was trained on thousands of photographs that were manually tagged with attractiveness ratings, often by university students. Because beauty standards vary from country to country and culture to culture, the raters’ perceptions were built into the algorithm as well. After the algorithm estimates skin color, age, life expectancy, and overall attractiveness, it produces an average score – the kind of result most AIs on social media or dating sites would give.
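The final step described above – collapsing several per-trait estimates into one score – could be sketched roughly as follows. This is a hypothetical illustration only: the trait names, the 0–10 scale, and the equal weighting are assumptions, not the actual “How Normal Am I?” implementation.

```python
# Hypothetical sketch of the averaging step described in the article.
# Trait names, scale, and equal weighting are assumptions, not the
# site's real code.

def normality_score(trait_scores):
    """Average several per-trait estimates (each on a 0-10 scale)
    into a single 'normality' score on the same scale."""
    if not trait_scores:
        raise ValueError("at least one trait estimate is required")
    return sum(trait_scores.values()) / len(trait_scores)

# Example: estimates an AI might produce for one photograph.
estimates = {
    "age": 6.1,
    "attractiveness": 5.4,
    "body_mass_index": 7.0,
    "life_expectancy": 5.9,
}
print(round(normality_score(estimates), 1))  # prints 6.1
```

A real system would weight traits differently and derive each estimate from a trained model, but the point the article makes holds either way: the output is an average, so scoring well means being average.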

“As facial recognition technology moves into our daily lives, it can create a subtle yet pervasive feeling that we are constantly being watched and evaluated. You may feel more pressure to behave ‘normally’, which for the algorithm means just being average. This is why we must defend our right to privacy, which is, in essence, our right to be different. You could say that privacy is the right to be imperfect,” said the artist.
