The British expert Michael Osborne sees great danger in the rapid spread of the chatbot ChatGPT and similar programs and is calling for swift regulation. “ChatGPT, like other language bots, can become a turbocharger for the spread of misinformation and propaganda,” said the scientist from the University of Oxford, who has worked on artificial intelligence (AI) for years, in an interview with the German Press Agency.
ChatGPT is an application that uses artificial intelligence to generate detailed responses to text input. Among other things, the chatbot can answer questions in different languages, summarize and evaluate texts, and write poems or computer programs.
Danger lies in personalization
Bots that spread misinformation are already a major problem today, said Osborne, a professor of machine learning. “It doesn’t take much science fiction to assume that language bots are the next step in this field.”
It has already been demonstrated that these programs can produce hate speech, even when their creators try to prevent it. Artificial intelligence is particularly dangerous because it enables personalization on a large scale. “The language tools can tailor their message to specific groups, even specific individuals.”
The researcher therefore sees an urgent need for policymakers to catch up. “The status quo is not one that we want to continue. We currently rely on the tech companies to regulate themselves,” he emphasized. “The regulation of artificial intelligence is necessary and urgent. Unfortunately, the regulators have not yet moved as quickly as they should, given the pace of change.”
Great potential nonetheless
Before a committee of the British House of Commons, Osborne recently called for regulation similar to that governing nuclear weapons, since he sees comparable destructive potential in the technology. Despite all the warnings, he sees great promise in it: “I firmly believe that AI can be a tool that serves human well-being.” The tools could help address numerous challenges, such as aging societies and the transition to climate-neutral economies.