“How much better might our last moments together have been if it weren’t for the delusions induced by YouTube?” Last September, the son of an 80-year-old retired scientist told how his father plunged into a spiral of toxic videos during his final years. He did so through a campaign by the Mozilla Foundation, created to raise awareness of the problems generated by the recommendation algorithm of this popular video platform. Now, several researchers at Google, the company that owns the platform, have proposed a modification of the algorithm that would improve recommendations and increase the time users stay connected.

Artificial intelligence controls much of the information consumed on the Internet today. The algorithms created for the different platforms “observe the activity of users, infer things that may interest them and propose them,” says Pablo Castells, a professor at the Escuela Politécnica Superior of the Universidad Autónoma de Madrid. “There are many ways of doing this, from the most trivial, such as simply offering the most popular content, to more complex approaches that involve looking at the behavior of each individual user.”
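To make the contrast Castells describes concrete, here is a toy sketch in Python of the two extremes; the watch logs, function names and scoring are invented for illustration and do not correspond to any platform’s real code.

```python
from collections import Counter

# Hypothetical watch logs: user -> list of (video_id, topic) pairs.
logs = {
    "ana":   [("v1", "cooking"), ("v2", "cooking"), ("v3", "news")],
    "ben":   [("v2", "cooking"), ("v6", "cooking"), ("v4", "gaming")],
    "carla": [("v4", "gaming"), ("v5", "gaming")],
}

def most_popular(logs, k=2):
    """The trivial strategy: recommend the globally most-watched videos."""
    counts = Counter(video for views in logs.values() for video, _ in views)
    return [video for video, _ in counts.most_common(k)]

def personalized(logs, user, k=2):
    """The per-user strategy: unseen videos from the user's favorite topic."""
    seen = {video for video, _ in logs[user]}
    top_topic = Counter(topic for _, topic in logs[user]).most_common(1)[0][0]
    candidates = {video for views in logs.values()
                  for video, topic in views
                  if topic == top_topic and video not in seen}
    return sorted(candidates)[:k]

print(most_popular(logs))         # the same list for everyone
print(personalized(logs, "ana"))  # cooking videos Ana has not seen: ['v6']
```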

In the case of YouTube, the platform first builds a list of recommendations with several hundred videos related to the one the user is watching, and then refines that list by taking into account their clicks, likes and other interactions. The result is that, of the billions of hours watched every day on the platform, around 70% correspond to videos recommended by the algorithm.
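This description corresponds to a classic two-stage design: candidate generation followed by ranking. A minimal sketch, in which the index, the history format and the scoring weights are all made up, might look like this:

```python
# Illustrative two-stage pipeline of the kind described above:
# (1) gather up to a few hundred candidates related to the current video,
# (2) re-rank them with the user's interaction history.

def generate_candidates(current_video, related_index, n=200):
    """Stage 1: pull up to n videos related to the one being watched."""
    return related_index.get(current_video, [])[:n]

def rank(candidates, user_history):
    """Stage 2: refine the list using clicks, likes and other signals."""
    def score(video):
        return (2.0 * user_history["likes"].get(video["topic"], 0)
                + 1.0 * user_history["clicks"].get(video["topic"], 0))
    return sorted(candidates, key=score, reverse=True)

related_index = {"v42": [{"id": "a", "topic": "cooking"},
                         {"id": "b", "topic": "gaming"}]}
user_history = {"likes": {"gaming": 3}, "clicks": {"cooking": 1}}
print(rank(generate_candidates("v42", related_index), user_history))
# -> the gaming video outranks the cooking one for this user
```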

The different platforms work to improve this system, to make it even more precise and to keep users in front of the screen a few minutes longer, and this is what a team of YouTube researchers appears to have achieved, according to an article published in the ACM Digital Library. “We show that our proposals can lead to substantial improvements in the quality of the recommendations,” the study says.

To refine the recommendations, the researchers tried giving more weight to the videos located in the lower part of the list, on the understanding that if the user clicked on one of those videos, it is because they spent some time looking for it. Thanks to this modification, the developers of the new algorithm claim to have achieved “substantial improvements in both engagement and satisfaction metrics.”

“It is a clever way to address the problem,” says Castells, “because we know that there are areas of the screen that are more exposed, so a click obtained in those areas has less merit than one achieved on an element that is more hidden.”
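One standard way to implement this intuition is inverse propensity weighting, where each click is weighted by the inverse of how likely its slot was to be seen at all. The sketch below uses invented exposure probabilities and is only an illustration of the general technique, not the method the YouTube paper actually describes:

```python
# Position-bias correction via inverse propensity weighting: a click far
# down the list is a stronger signal than a click at the top, because
# the user had to scroll past more-exposed slots to reach it.

# Estimated probability that a user even looks at each list position
# (position 0 is most exposed; deeper slots are seen less often).
exposure = [0.95, 0.80, 0.60, 0.40, 0.25, 0.15]

def click_credit(position):
    """Weight a click by the inverse of its slot's exposure, so clicks
    on hidden, low-ranked items count for more in training."""
    return 1.0 / exposure[position]

for pos in range(len(exposure)):
    print(f"click at position {pos}: credit {click_credit(pos):.2f}")
```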

An environment of hoaxes and echo chambers

However, this modification still does not resolve one of the biggest problems with these algorithms. Because the system is optimized to keep users watching videos, it tends to offer recommendations that reinforce the user’s tastes or beliefs, which can create an experience that excludes other opinions and fosters what are known as echo chambers.
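The mechanism is easy to reproduce in a toy simulation: if the recommender always exploits the topic with the highest measured engagement, an early preference locks in and other topics disappear from view. All topics, numbers and the engagement rule below are invented:

```python
import random

# Toy simulation of the feedback loop behind echo chambers: a
# recommender that always serves the topic with the highest measured
# engagement reinforces early preferences until other topics vanish.

random.seed(0)
engagement = {"news": 1.0, "sports": 1.0, "conspiracy": 1.0}

history = []
for _ in range(20):
    recommended = max(engagement, key=engagement.get)  # exploit only
    history.append(recommended)
    # Watching a topic nudges its measured engagement upward.
    engagement[recommended] += random.uniform(0.5, 1.0)

print(history)  # one topic quickly crowds out all the others
```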

Along these lines, Google research on the impact of recommendation systems, published earlier this year, concluded that “feedback loops in recommendation systems can lead to ‘echo chambers,’ which can reduce a user’s exposure to content and, ultimately, change their view of the world.”


Various studies conducted in recent years, including an experiment carried out by journalists at EL PAÍS, have also shown that the algorithm tends to reward the most extreme and controversial videos, even when they are full of hoaxes. “Three years ago, my ex-wife, who suffers from mental health issues, began watching conspiracy-theory videos and believed them all. YouTube did not stop feeding her paranoia, fears and anxieties with one video after another,” says another of the testimonies collected by the Mozilla Foundation.

“In the recommender systems community there is growing concern in this regard, and there are more and more efforts to promote responsible recommendation,” says Castells. According to this specialist, it must be kept in mind that “the goals of the user and the company are not necessarily aligned: the company needs the user to be happy, but in a way that is profitable, and that is achieved if the user spends more time connected.” The problem, says the researcher, “is that the algorithm does not know when the user is happy and when they have slipped into a compulsive mode.”

When Peppa Pig killed her father

The algorithm has also been questioned for its inability to properly filter children’s content. According to a study published this year on arXiv (a repository of scientific articles that are not peer-reviewed), “there is a 45% chance that a young child following YouTube’s recommendations will encounter an inappropriate video in fewer than 10 clicks.”

The authors of this study argue that the problem lies in the fact that certain videos aimed at adults use children’s characters, and the algorithm does not tell the difference. There are thousands of examples on the platform, from cartoons in which Mickey Mouse is run over by a car to others in which Peppa Pig kills her father.

The solution, according to Castells, “would be not to offer content based simply on the volume of response, but to do something more qualitative, identifying types of content.” However, this computer scientist warns not only of the technical complexity of the problem, but also of the “ethical dilemma involved in deciding which content is marked as inappropriate.”
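As a rough sketch of what Castells is suggesting, a ranker could filter on a content-type label before considering engagement volume; the labels and the blocking rule here are placeholders, and producing reliable labels automatically is precisely the hard part he points to:

```python
# Minimal sketch: rank only videos whose content-type label is allowed
# for the audience, instead of ranking purely by engagement volume.

BLOCKED_FOR_KIDS = {"violence", "horror"}

videos = [
    {"id": "v1", "views": 9_000_000, "content_type": "violence"},
    {"id": "v2", "views": 1_200_000, "content_type": "education"},
    {"id": "v3", "views": 5_000_000, "content_type": "cartoon"},
]

def recommend_for_kids(videos):
    safe = [v for v in videos if v["content_type"] not in BLOCKED_FOR_KIDS]
    return sorted(safe, key=lambda v: v["views"], reverse=True)

print([v["id"] for v in recommend_for_kids(videos)])  # ['v3', 'v2']
```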

The problems generated by these algorithms led Mozilla, a non-profit organization dedicated to free software, to create a campaign to raise the alarm about them. Through this initiative, it has collected hundreds of testimonies from people who have been affected by these recommendations or have watched a loved one sink into YouTube’s toxic spiral. “It is sad and frustrating to watch a loved one sink deeper and deeper into this kind of dark, negative and harmful influence,” laments one of them.
