
A group of Nobel Prize winners gathered at the end of last month to discuss artificial intelligence and the end of the world. It sounds like the opening of a bad science fiction film, but unfortunately it is not.
Human judgment remains fundamental to decisions about nuclear weapons.
But according to a group of scientists and nuclear experts who recently gathered to reflect on the impact of AI on human decision-making, it is a matter of when, not if, artificial intelligence will eventually gain access to nuclear codes.
“It’s like electricity,” says Bob Latiff, retired Major General of the US Air Force and member of the Bulletin of the Atomic Scientists’ Science and Security Board. “It will infiltrate everything.”
Last year, Air Force General Anthony Cotton, who oversees the American arsenal of nuclear missiles, boasted at a defense conference that the Pentagon is betting heavily on AI, arguing that it will “improve our decision-making abilities.”
The general did not go so far as to say that we should let AI take full control of nuclear weaponry. “We should never allow artificial intelligence to make these decisions for us,” he said at the time.
One of the public figures who has expressed the most concern about the rise of AI is James Cameron. Last year, for example, the director predicted the “”.
In a recent interview with, Cameron again warned that the use of artificial intelligence in a global arms race could lead to the kind of dystopia he has been prophesying since 1984 in his “Terminator” saga.
According to Cameron, humans face three existential threats: superintelligence, nuclear weapons, and the climate crisis.
“I think there is still the danger of a Terminator-style apocalypse, where AI is combined with weapons systems, even at the level of nuclear weapons systems, nuclear defense, counterstrike, all of that,” said Cameron.
“The theater of operations is so fast, the decision windows are so short, it would take a superintelligence to be able to process it. But maybe we’ll be smart and keep a human in the loop,” said the director.
“But humans are fallible, and there have been many mistakes made that brought us to the brink and could have led to nuclear war. So, I don’t know,” adds Cameron.
“The climate crisis and the general degradation of the natural world, nuclear weapons, superintelligence… they are all, in a way, manifesting and peaking at the same time. Perhaps superintelligence is the answer.”
Perhaps.