AI experts influence industry direction by boycotting research into killer robots

Thursday, May 03, 2018

The president of the prestigious Korea Advanced Institute of Science and Technology (KAIST) is backtracking on plans to build artificial intelligence (AI) military weaponry of the kind the United Nations (U.N.) is working to restrict, after more than 50 of the world’s leading AI experts announced a boycott of the university.

Sung-Chul Shin, who had previously signed an agreement with the arms company Hanwha Systems, a maker of “devastating” cluster munitions, to develop AI weapons, recently announced that he’s reneging on the cooperative plan due to pressure from industry leaders, many of whom warn that such weapons pose a grave threat to the future of humanity.

Since AI weapons have the potential to act on their own without human input or control, they pose a Terminator-type threat to the planet that could eventually wipe out the human race entirely. Tesla head Elon Musk has repeatedly warned about this, noting that even non-military AI has the potential to wipe out 95 percent or more of humanity.

In their boycott letter, which ultimately convinced Shin to reverse course, the experts wrote of their disappointment in KAIST’s position on the AI arms race.

“At a time when the United Nations is discussing how to contain the threat posed to international security by autonomous weapons, it is regrettable that a prestigious institution like KAIST looks to accelerate the arms race to develop such weapons,” they stated.

“We therefore publicly declare that we will boycott all collaborations with any part of KAIST until such time as the President of KAIST provides assurances, which we have sought but not received, that the Center will not develop autonomous weapons lacking meaningful human control.”

AI technologies are anti-human and can’t be allowed to proliferate

Keep in mind that KAIST is home to some of the world’s leading robotics and science labs, where everything from liquid batteries to disease sensors is under development. Being publicly shamed by experts in AI, another leading technological frontier, isn’t something the school wants attached to its reputation, hence Shin’s decision to end its participation in developing AI military weapons.

“It goes to show the power of the scientific community when we choose to speak out – our action was an overnight success,” stated Toby Walsh from the University of New South Wales (UNSW) in Australia.

“I was very pleased that the president of KAIST has agreed not to develop lethal autonomous weapons, and to follow international norms by ensuring meaningful human control of any AI-based weapon that will be developed.”

In his own statement given to the media, Shin explained that his institution values “human rights and ethical standards to a very high degree.” As a result, it “will not conduct any research activities counter to human dignity, including autonomous weapons lacking meaningful human control.”

Should any other institution or government successfully develop autonomous weapons, however, the consequences would be nothing short of devastating. Many are calling such a scenario the “third revolution in warfare,” after gunpowder and nuclear arms, in which advanced AI weapons like missiles, rockets, and bombs select targets and launch on their own, completely independent of human input.

It would be mass mayhem, for one. It would also presumably open the door to global terrorism waged with “smart” robots, a “Pandora’s box” of weapon use that, once opened, could never again be closed and contained.

The U.N. has already scheduled a meeting in Geneva, Switzerland, to discuss the issues surrounding lethal autonomous weapons systems, or LAWS, and to develop standards for regulating them going forward. Many different people and organizations, from the Red Cross to Elon Musk himself, are calling for strict limits or even blanket bans on such systems.

For more news on AI technologies and how governments of the world are handling their development, visit AISystems.news.

Sources for this article include:

ScienceAlert.com

NaturalNews.com


