YouTube is where my mother puts on Dire Straits videos while she cooks, where my father laughs at La Vida Moderna and my nephew devours video game tutorials. But it is also the bonfire around which the global far right strengthens its ties, and the mosquito that spreads dangerous rumours about vaccines. For my mother, YouTube strings together Mark Knopfler and Eric Clapton, because it understands her old-school tastes. But that recommendation system can become the enemy of the truth when you look for information on current issues such as the climate crisis. That is YouTube's great problem: the machine serves up whatever will keep us watching videos for longer, and the most sensational and controversial content tends to win. That is why, in many cases, it drags us into a spiral of toxic content. Ev Williams, one of the creators of Twitter, used this analogy to explain how algorithmic recommendation works: if drivers slow down to look whenever there is an accident on the road, the algorithm would conclude that this is what we like to do and would send us down roads full of accidents so that we could enjoy the views.

Such is the attention economy: companies design programs to keep us hooked to our screens. That is why the spread of lies on these platforms is so hard to solve: it touches their very spinal cord, and it cannot be stopped without sacrificing profits. Limiting virality attacks the business model head-on. YouTube has helped spread lies on a global scale and now, after complaints from organisations such as Avaaz, it is working to shrink the tsunami of disinformation. It is the bare minimum: charlatans have the right to speak, nothing more, but they do not have the right to the global dissemination of their lies that YouTube's recommendation system grants them, on what is the second most-used search engine on the planet.

But this is "only the tip of the iceberg of a dangerous nexus that joins profit on a global scale with artificial intelligence", as the techno-sociologist Zeynep Tufekci warns. Over the years, Google's video platform has become the realm of "alternative facts", the place to find "what the media hide", the place where conspiracy theories become "untold truths". This has fed propaganda dynamics that, through online communities and influencer culture, go beyond the algorithms, as Rebecca Lewis of Stanford University has pointed out: the algorithms are just one factor in a broader set of social, economic and technical issues and incentives built into the platform. There is room on the network for "alternative information", even if that information runs so parallel to reality that it never crosses paths with it. But it is unacceptable that disinformation deliberately generated to poison the public should be broadcast to the world while it fills Silicon Valley's pockets.