
VOA Learning English (Translation + Subtitles + Commentary): Activists Call for a Ban on Fully Autonomous Weapons

Source: Kekenet English   Editor: Sunny
  
From VOA Learning English, this is the Technology Report.

An international coalition is calling for a ban on fully autonomous weapons known as "killer robots." The 45-member coalition proposed the ban to governments last month at a meeting of the United Nations Convention on Conventional Weapons in Geneva, Switzerland.

The Campaign to Stop Killer Robots wants the U.N. organization to add fully autonomous weapons to its work program in 2014.

Scientists have yet to develop fully autonomous killer robots. However, technology is moving toward increasing autonomy. Such weapons would identify and attack targets without human assistance.

(Photo: A robot is pictured in front of the Houses of Parliament and Westminster Abbey as part of the Campaign to Stop Killer Robots in London, April 23, 2013.)

Noel Sharkey is a founding member of the Campaign to Stop Killer Robots. He also chairs the International Committee for Robot Arms Control. Mr. Sharkey says autonomous weapons should be banned.

"The big problem for me is that there are no robot systems that can discriminate between civilian targets and military targets unless they are very, very clearly marked in some way... So, the idea of having robots going out into the field and selecting their own targets is, to me, just horrifying. It cannot work," said Sharkey.

Activists say robotic systems with different degrees of autonomy are already in use by Britain, Israel, the United States and South Korea. They believe China and Russia are also moving toward these systems.

Steve Goose is a member of the campaign. He also directs the Arms Division at Human Rights Watch. He warns that killer robots will become a reality unless governments act now to ban them. He says the world should oppose weapons systems that would be able to identify and attack targets mechanically.

He believes such a system crosses a basic moral and ethical line.

"Robotic weapons systems should not make life and death decisions on the battlefield. That is simply inherently wrong. So, they need to be banned on ethical grounds. We think they also need to be banned on legal grounds. If and when a killer robot commits a war crime, violates international humanitarian law... who would be held accountable, who would be responsible for that violation?" said Goose.

He adds that in recent months, fully autonomous weapons have gone from a little-known issue to one that is commanding worldwide attention. He says that since last May, 34 countries have openly expressed concern about the dangers the weapons present.

Mr. Goose notes that in 1995, the Convention on Conventional Weapons added a new policy to the treaty, which barred the use of blinding lasers. He believes killer robots could become the second such weapon to be banned before it is ever used in battle.

And that is the Technology Report from VOA Learning English. For more about our reports, visit our website at learningenglish.voanews.com. I'm June Simms.


The English text comes from 51voa; the translation is original to Kekenet and is provided for study and exchange only. Please do not reproduce it without permission.


Key Vocabulary

dilemma [di'lemə]   n. dilemma; predicament
civilian [si'viljən]   adj. civilian   n. civilian; expert in Roman law
laser ['leizə]   n. laser
military ['militəri]   adj. military   n. the military, armed forces
identify [ai'dentifai]   vt. to identify, recognize, verify   vi. to identify (with)
oppose [ə'pəuz]   vt. to oppose, resist, set against
autonomy [ɔ:'tɔnəmi]   n. autonomy, self-government
violation [.vaiə'leiʃən]   n. violation, infringement
ethical ['eθikəl]   adj. ethical, moral
control [kən'trəul]   n. control, restraint, regulation; operating device   vt. to control
