Worried over robot war

It sounds like a science-fiction nightmare. But “killer robots” have the likes of British scientist Stephen Hawking and Apple co-founder Steve Wozniak fretting, warning that such machines could fuel ethnic cleansing and an arms race.
Autonomous weapons, which use artificial intelligence to select targets without human intervention, are “the third revolution in warfare, after gunpowder and nuclear arms,” about 1,000 tech bigwigs wrote in an open letter on July 28.
Unlike drones, which require a human hand in their operation, this kind of robot would have some autonomous decision-making ability and the capacity to act on its own authority.
“The key question for humanity today is whether to start a global AI (artificial intelligence) arms race or to prevent it from starting,” they wrote.
“If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable,” said the letter released at the opening of the 2015 International Joint Conference on Artificial Intelligence in Buenos Aires.
The idea of an automated killing machine, made famous by Arnold Schwarzenegger’s Terminator, is moving swiftly from science fiction to reality, according to the scientists. Deploying such systems, especially unlawfully, could take just a few years rather than decades, the letter warned.
Lower bar for entry
The development of such weapons, while potentially reducing the extent of battlefield casualties, might also lower the threshold for going to war, the scientists noted.
The scientists painted an apocalyptic scenario in which autonomous weapons fall into the hands of terrorists, dictators or warlords hoping to carry out ethnic cleansing.
The group concluded with an appeal for a “ban on offensive autonomous weapons beyond meaningful human control.”
In a 2014 BBC interview, Hawking said the development of full artificial intelligence could spell the end of the human race.
“It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded,” he said.
Authorities are gradually waking up to the risk of robot wars. Last May, for the first time, the United Nations brought governments together to begin talks on so-called “lethal autonomous weapons systems” that can select targets and carry out attacks without direct human intervention.
In 2012, the US government imposed a 10-year human control requirement on automated weapons.
There have been examples of weapons being stopped in their infancy.
After UN-backed talks, blinding laser weapons were banned in 1998, before they ever hit the battlefield.