Question content (please give the correct answer)
Asked by: user pclove · Posted: 2022-01-06
[Subjective question]

Blay Whitby pointed out that people only began to be aware of the safety issue ______.

A. when household robots were invented

B. when robots were in wide use

C. after some grave accidents

D. ten years or so ago

More questions related to "Blay Whitby pointed out that people only began to be aware of the safety issue ______."
Question 1

Blay Whitby thinks that "safety-critical computing"

A. is in its infancy.

B. has its heyday.

C. undergoes rapid development.

D. can hardly be improved.

Question 2
Into which modes did Whitby classify aerosol particles, and what are the characteristics of each mode?
Question 3
Ethelbert, King of Kent, provided St. Augustine with a house for his followers in ().

A. Canterbury

B. Westminster

C. Whitby

D. London

Question 4
Based on the relationship between the surface-area and particle-size distributions of atmospheric particulate matter, Whitby et al. identified three distinct types of size modes, namely: (), (), ().

Question 5
In 1981 Kenji Urada, a 37-year-old Japanese factory worker, climbed over a safety fence at a Kawasaki plant to carry out some maintenance work on a robot. In his haste, he failed to switch the robot off properly. Unable to sense him, the robot's powerful hydraulic arm kept on working and accidentally pushed the engineer into a grinding machine. His death made Urada the first recorded victim to die at the hands of a robot.

This astounding industrial accident would not have happened in a world in which robot behavior was governed by the Three Laws of Robotics drawn up by Isaac Asimov, a science fiction writer. The laws appeared in I, Robot, a book of short stories published in 1950 that inspired a Hollywood film. But decades later the laws, designed to prevent robots from harming people either through action or inaction, remain in the realm of fiction.

With robots now poised to emerge from their industrial cages and to move into homes and workplaces, roboticists are concerned about the safety implications beyond the factory floor. To address these concerns, leading robot experts have come together to try to find ways to prevent robots from harming people. "Security, safety and sex are the big concerns," says Henrik Christensen, chairman of the European Robotics Network at the Swedish Royal Institute of Technology in Stockholm, and one of the organisers of the new roboethics group. Should robots that are strong enough or heavy enough to crush people be allowed into homes? Should robotic sex dolls resembling children be legally allowed?

These questions may seem esoteric but in the next few years they will become increasingly relevant, says Dr. Christensen. According to the United Nations Economic Commission for Europe's World Robotics Survey, in 2002 the number of domestic and service robots more than tripled, nearly outstripping their industrial counterparts. Japanese industrial firms are racing to build humanoid robots to act as domestic helpers for the elderly, and South Korea has set a goal that 100% of households should have domestic robots by 2020. In light of all this, it is crucial that we start to think about safety and ethical guidelines now, says Dr. Christensen.

So what exactly is being done to protect us from these mechanical menaces? "Not enough," says Blay Whitby, an artificial-intelligence expert at the University of Sussex in England. This is hardly surprising given that the field of "safety-critical computing" is barely a decade old, he says. But things are changing, and researchers are increasingly taking an interest in trying to make robots safer. One approach, which sounds simple enough, is to try to program them to avoid contact with people altogether. But this is much harder than it sounds. Getting a robot to navigate across a cluttered room is difficult enough without having to take into account what its various limbs or appendages might bump into along the way.

Regulating the behavior of robots is going to become more difficult in the future, since they will increasingly have self-learning mechanisms built into them, says Gianmarco Veruggio, a roboticist at the Institute of Intelligent Systems for Automation in Genoa, Italy. As a result, their behavior will become impossible to predict fully, he says, since they will not be behaving in predefined ways but will learn new behavior as they go.

The word "astounding" in the second paragraph is closest in meaning to

A. gullible.

B. awesome.

C. gruesome.

D. stupendous.

Question 6

The enemy were compelled to ______ their arms.

A. lay down

B. lay out

C. lay back

D. lay against

Question 7
According to the particle model of Whitby et al., atmospheric particulate matter can be represented as a ______ structure;

A. Aitken nucleus mode

B. molecular nucleus mode

C. accumulation mode

D. coarse particle mode

Question 8

Trust Me, I Am a Robot

Robot safety: as robots move into homes and offices, ensuring that they do not injure people will be vital. But how?

The incident

In 1981 Kenji Urada, a 37-year-old Japanese factory worker, climbed over a safety fence at a Kawasaki plant to carry out some maintenance work on a robot. In his haste, he failed to switch the robot off properly. Unable to sense him, the robot's powerful hydraulic arm kept on working and accidentally pushed the engineer into a grinding machine. His death made Urada the first recorded victim to die at the hands of a robot.

This gruesome industrial accident would not have happened in a world in which robot behaviour was governed by the Three Laws of Robotics drawn up by Isaac Asimov, a science-fiction writer. The laws appeared in I, Robot, a book of short stories published in 1950 that inspired a recent Hollywood film. But decades later the laws, designed to prevent robots from harming people either through action or inaction, remain in the realm of fiction.

Indeed, despite the introduction of improved safety mechanisms, robots have claimed many more victims since 1981. Over the years people have been crushed, hit on the head, welded and even had molten aluminium poured over them by robots. Last year there were 77 robot-related accidents in Britain alone, according to the Health and Safety Executive.

More related issues

With robots now poised to emerge from their industrial cages and to move into homes and workplaces, roboticists are concerned about the safety implications beyond the factory floor. To address these concerns, leading robot experts have come together to try to find ways to prevent robots from harming people. Inspired by the Pugwash Conferences--an international group of scientists, academics and activists founded in 1957 to campaign for the non-proliferation of nuclear weapons—the new group of robo-ethicists met earlier this year in Genoa, Italy, and announced their initial findings in March at the European Robotics Symposium in Palermo, Sicily.

"Security, safety and sex are the big concerns," says Henrik Christensen, chairman of the European Robotics Network at the Swedish Royal Institute of Technology in Stockholm, and one of the organisers of the new robo-ethics group. Should robots that are strong enough or heavy enough to crush people be allowed into homes? Is "system malfunction" a justifiable defence for a robotic fighter plane that contravenes the Geneva Convention and mistakenly fires on innocent civilians? And should robotic sex dolls resembling children be legally allowed?

These questions may seem esoteric but in the next few years they will become increasingly relevant, says Dr. Christensen. According to the United Nations Economic Commission for Europe's World Robotics Survey, in 2002 the number of domestic and service robots more than tripled, nearly surpassing their industrial counterparts. By the end of 2003 there were more than 600,000 robot vacuum cleaners and lawn mowers — a figure predicted to rise to more than 4m by the end of next year. Japanese industrial firms are racing to build humanoid robots to act as domestic helpers for the elderly, and South Korea has set a goal that 100% of households should have domestic robots by 2020. In light of all this, it is crucial that we start to think about safety and ethical guidelines now, says Dr. Christensen.

Difficulties

So what exactly is being done to protect us from these mechanical menaces? "Not enough," says Blay Whitby, an artificial-intelligence expert at the University of Sussex in England. This is hardly surprising given that the field of "safety-critical computing" is barely a decade old, he says. But things are changing, and researchers are increasingly taking an interest in trying to make robots safer.

Regulating the behaviour of robots is going

A. Y

B. N

C. NG

Question 9

______ was pointed above, this substance can be used as a substitute.

A. It

B. That

C. What

D. As

Question 10

A. pointed

B. steep

C. vertical

D. sharp
