Vienna summit urges restrictions on AI weapons


A global conference urged on Tuesday that the world establish rules to govern AI weapons while they are still in their infancy, calling the issue an “Oppenheimer moment” in time.

Artificial intelligence (AI), like gunpowder and the atomic bomb, has the potential to transform warfare, analysts argue, making human conflicts radically different and considerably more lethal.


“This is our generation’s ‘Oppenheimer moment’ where geopolitical tensions threaten to lead a major scientific breakthrough down a very dangerous path for the future of humanity,” read the summary at the end of the two-day conference in Vienna.

Robert Oppenheimer, a physicist from the United States, contributed to the development of nuclear weapons during WWII.

Austria planned and hosted the two-day conference in Vienna, which drew around 1,000 attendees from over 140 nations, including political leaders, professionals, and members of civil society.

A final statement said the group “affirms our strong commitment to work with urgency and with all interested stakeholders for an international legal instrument to regulate autonomous weapons system”.

“We have a responsibility to act and to put in place the rules that we need to protect humanity… Human control must prevail in the use of force”, said the summary, which is to be sent to the UN secretary general.

All types of weaponry can be made into autonomous systems using AI, thanks to powerful sensors guided by algorithms that allow a computer to “see”.

This would allow weapons to detect, select, and attack human targets, or targets containing humans, without human intervention.

Most such weapons are still at the concept or prototype stage, but Russia’s war in Ukraine has offered a glimpse of their potential.

Remotely controlled drones are not new, but they are getting more autonomous and are employed by both sides.


“Autonomous weapons systems will soon fill the world’s battlefields,” Austrian Foreign Minister Alexander Schallenberg warned on Monday, kicking off the conference.

He warned that the “time to agree on international rules and norms to ensure human control” has arrived.

In 2023, Austria, a neutral country eager to promote disarmament in international forums, proposed the first UN resolution to govern autonomous weapons systems, which received backing from 164 states.

‘Uncorrectable mistakes’

A Vienna-based privacy advocacy group announced that it would file a complaint against ChatGPT in Austria, claiming that the “hallucinating” flagship AI tool produces incorrect answers that its creator OpenAI cannot correct.

NOYB (“None of Your Business”) stated that there was no way to ensure the programmes gave accurate information. “ChatGPT keeps hallucinating—and not even OpenAI can stop it,” the group said in a statement.

The firm has openly admitted that it cannot fix false information produced by its generative AI tool and has failed to clarify where the data comes from and what ChatGPT stores about individuals, according to the organisation.

