
University boycott ends after KAIST confirms no 'killer robot' development

The boycott involving over 50 researchers from 30 countries has ended after South Korea's KAIST university agreed not to participate in the development of lethal autonomous weapons.
Written by Asha Barbaschow, Contributor

The Korea Advanced Institute of Science and Technology (KAIST) has announced it will not participate in the development of lethal autonomous weapons, after it was revealed last week that a group of over 50 international researchers would boycott all dealings with the South Korean university if it did so.

"KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots," KAIST president professor Sung-Chul Shin told ScienceInsiderin response to the boycott.

"KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control."

The boycott was announced last week, following reports that KAIST was opening an artificial intelligence weapons lab in collaboration with Hanwha Systems, a defence company building cluster munitions despite United Nations bans, as well as a fully autonomous weapon, the SGR-A1 Sentry Robot.

Given the "swift and clear commitment to the responsible use of artificial intelligence in the development of weapons", the 56 AI and robotics researchers who were signatories to the boycott have rescinded the action, a statement from the University of New South Wales (UNSW) explains.

They will continue to visit and host researchers from KAIST, as well as work together on scientific projects, the statement continued.

"I was very pleased that the president of KAIST has agreed not to develop lethal autonomous weapons, and to follow international norms by ensuring meaningful human control of any AI-based weapon that will be developed," said UNSW scientia professor of artificial intelligence Toby Walsh. "I applaud KAIST for doing the right thing, and I'll be happy to work with KAIST in the future."

Walsh organised the boycott on behalf of the group of researchers, and said the statement from the KAIST president was all the 50-plus signatories were seeking.

"At a time when the United Nations is discussing how to contain the threat posed to international security by autonomous weapons, it is regrettable that a prestigious institution like KAIST looks to accelerate the arms race to develop such weapons," the letter sent to KAIST read.

"If developed, autonomous weapons will be the third revolution in warfare. They will permit war to be fought faster and at a scale greater than ever before. They have the potential to be weapons of terror. Despots and terrorists could use them against innocent populations, removing any ethical restraints.

"This Pandora's box will be hard to close if it is opened."

RELATED COVERAGE

Researchers boycott Korean university over 'killer robot' AI weapons lab

Over 50 researchers from 30 countries will be boycotting all contact with South Korea's KAIST university once it opens an artificial intelligence weapons lab.

AI 'more dangerous than nukes': Elon Musk still firm on regulatory oversight

The man building a spaceship to send people to Mars has used his South by Southwest appearance to reaffirm his belief that the danger of artificial intelligence is much greater than the danger of nuclear warheads.

Elon Musk among tech founders to call for UN to ban 'killer robots'

Founders of AI and robotics companies around the world are urging the UN to ban the development and use of autonomous weapons before it's too late.

Should we ban killer robots?

Academics and a group of NGOs have different opinions on how autonomous weapons should be defined and regulated.

Elon Musk fears AI may lead to World War III, as researchers study the risks of 'stupid, good robots' (TechRepublic)

As with most technology, there is a fine line between good and evil in its use. What happens when AI built with good intentions goes bad?
