The IEEE Society for Social Implications of Technology (SSIT) invites you to contribute to the IEEE International Symposium on
Technology and Society (ISTAS) 2018, hosted by the George Washington University School of Engineering and Applied Science.
ISTAS is a multi-disciplinary and interdisciplinary forum for engineers, policy makers, entrepreneurs, philosophers, researchers, social
scientists, technologists, and polymaths to collaborate, exchange experiences, and discuss social implications of technology.
We welcome proposals for papers and practitioner presentations, panels, and workshop sessions focused on technology’s
relationship to social issues ranging from the economic and ethical to the cultural and environmental; in particular, we seek
submissions engaging with the following topics:
■ The social implications of technology as they relate to SSIT's Five Pillars: Sustainable Development and Humanitarian Technology, Ethics/Human Values, Universal Access to Technology, Societal Impacts, and Protecting the Planet.
For more information about ISTAS, submission guidelines, and updates, visit our website:
http://technologyandsociety.org/event/2018-ieee-international-symposium-technology-society-istas/
Paul M. Cunningham
Considering the IEEE tagline "Advancing Technology for Humanity," it is hardly surprising that many IEEE members are actively engaged in different forms of volunteerism addressing social challenges at community level, at home and abroad.

The IEEE Humanitarian and Philanthropic Opportunities (H&P) Initiative was launched at Sections Congress 2017 in Sydney by the IEEE Foundation and IEEE Humanitarian Activities Committee (HAC). The objective of H&P is to help IEEE members identify opportunities to volunteer their "time, talent or treasure" in the sustainable development and humanitarian technology space based on their interests, expertise, and availability.

Currently there are 12 groups across IEEE involved in H&P, offering team and individual volunteer opportunities of different durations for members at different stages of their careers, ranging from young professional or student, mid-career, pre-retirement, or retirement. Participating groups include the IEEE Foundation, IEEE Humanitarian Activities Committee (HAC), IEEE SIGHT (Special Interest Group on Humanitarian Technology), IEEE Eta Kappa Nu, EPICS in IEEE, IEEE Smart Village, IEEE Life Members Committee, IEEE History Center, IEEE Power & Energy Society Scholarship Plus Initiative, IEEE Internet Initiative (3i), IEEE Empower a Billion Lives, and the IEEE-USA Community Outreach Initiative (Move Project).

As an IEEE technical Society whose focus on all aspects of societal implications of technology complements the technical activities of all other IEEE Societies, SSIT members have a proud history of contributions to sustainable development and humanitarian technology. We have long focused on addressing ethical implications, interdependencies, context, and socio-cultural norms that are essential to avoid unintended and unanticipated consequences. One of our core strengths as a community has been our collaborative, partnership-based approach.

SSIT IST-Africa SIGHT members from the IST-Africa Institute, University of Gondar, Strathmore University, Chancellor College, and Nelson Mandela University have successfully built trust-based relationships with healthcare clinics in resource-constrained environments in Ethiopia, Kenya, Malawi, and South Africa. They are providing digital literacy training and supporting infrastructure development with the objective of supporting technology adoption to strengthen primary healthcare delivery.

The contribution of our active and committed volunteers is widely recognized. The sheer scale of opportunities to address the U.N. Sustainable Development Goals (SDGs) at home and abroad requires us to accelerate expanding our programs and continue growing our global footprint. The IEEE Society on Social Implications of Technology (SSIT) is truly demonstrating leadership in supporting operationalization of the IEEE tagline, "Advancing Technology for Humanity."

Call for Volunteers
I invite you to help SSIT continue to make a difference, particularly in the areas of sustainable development and humanitarian technology. Volunteer opportunities include:
■ Serving your local community through an existing or new SSIT Chapter.
■ Contributing to the work of SSIT's committees (including our Standards committee).
■ Volunteering to host SSIT Distinguished Lecturers.
■ Submitting articles or review submissions to IEEE Technology and Society Magazine.

Digital Object Identifier 10.1109/MTS.2018.2804961
Date of publication: 2 March 2018
(continued on page 21)
Editorial
5 One at a Time, and All at Once
Jeremy Pitt

Commentary
15 Connected Vehicle Security Vulnerabilities
Yoshiyasu Takefuji

Jeremy Pitt
One at a Time, and All at Once
Wicked Problems and Collective Action

A journey of a thou-
Drone Warfare
Drone Warfare (War and Conflict in the Modern World)
By John Kaag and Sarah Kreps. Malden, MA: Polity Press, 2014, 188 pages.
Ever since the discovery of the rock, military technology has made it possible to kill from a distance. While rocks remain in use, millennia of design iterations have brought us the laser-guided AGM-114 Hellfire missile, which can be fired by remote control from a MQ-9 Reaper drone by an operator as far away as 1150 miles. Already, thousands of people have been killed in this way. Drone Warfare, by philosopher John Kaag and political scientist Sarah Kreps, invites us to consider the implications of this for politics and domestic accountability, for international law, and for ethics.

Despite its title, Drone Warfare is not a comprehensive treatment of drone warfare. There is very little discussion of the technical aspects of drones beyond what is necessary in order to understand their uses. With that information briefly established, the book focuses squarely on normative questions about drone warfare. Moreover, the book's discussion addresses these questions almost exclusively with regard to the United States, the largest and most prolific user of drones. The U.S. is also, the authors hope, in the best position to establish norms and best practices regarding drones by virtue of its military might and international stature.

While drones are still a developing technology, the drone arsenal possessed by the United States already encompasses a wide variety of drone types with a correspondingly extensive range of potential missions. The authors have wisely narrowed their focus to perhaps the most troubling mission, the one most fraught with legal, political, and moral implications: the use of drones to kill suspected terrorists. These attacks can be "targeted killings," aimed at a known individual, or "signature strikes," aimed at someone whose behavior fits a "signature" or profile of suspected terrorists.

A recurring frustration I encountered with the book (one no doubt shared by its authors!) was the paucity of detailed, accurate data about drone use. While much of the book's discussion is therefore unavoidably speculative, the authors do their best to lay out what facts are available about drone strikes on suspected terrorists. The U.S. Department of Defense carries out drone strikes in combat zones (Afghanistan, Iraq, and Libya). Outside of combat zones (chiefly in Pakistan, Yemen, and Somalia) these take place under the aegis of the CIA. The U.S. began using drones to kill suspected terrorists in the aftermath of the September 11th attacks when Congress enacted the Authorization for the Use of Military Force (AUMF), handing the George W. Bush administration extremely open-ended authority to target those who "planned, authorized, committed, or aided the terrorist attacks." The Obama administration continued and expanded upon the drone policies of its predecessor, conducting more than one hundred drone strikes in Afghanistan alone in 2008 and more still in the following years. Drone strikes outside of combat zones also increased markedly under Obama. Some of those killed have been American citizens, most famously Anwar Al-Awlaki. The AUMF remains in effect, and is now being used to justify operations against groups that did not even exist when it was passed.

With some basic facts briefly established, the authors turn to normative matters. First up for consideration is how drones, which enable governments to engage in covert warfare with minimal risk of loss of life on the part of their service members, undermine political accountability. According to one influential strand of democratic theory (which the authors trace back to Kant's essay on perpetual peace), the high cost of war

Digital Object Identifier 10.1109/MTS.2018.2804962
Date of publication: 2 March 2018
Drowning in Information,
Starving for Knowledge
Information Overload Paradox: Drowning in Information, Starving for Knowledge.
By L. V. Orman. Seattle, WA: Create Space Independent Publishing, 2016, 190 pages.
Information overload is not a new phenomenon, but a part and parcel of modern life. In this vein, Georg Simmel earlier suggested in The Metropolis and Mental Life [1] that overwhelming stimuli transform the psyche of urban individuals and help them develop a blasé attitude. Social scientists have also sought to understand "information overload," its determinants, consequences, and remedies ever since. An "information overload" keyword search in Google Books today yields 300 000+ hits! So is there anything new in Orman's book?

Orman poses "information overload" as a paradox and carries out the daunting task of drawing from a large body of literature to give us three mechanisms through which such paradox arises. The paradox is that technologies help us know more, but in the process, we know less. In a Simmelian world, this is not an entirely novel proposition. However, Orman's simplification of the problem along with pertinent evidence makes the mechanisms a compelling narrative. As we are moving fast towards "ubiquitous computing," Orman's effort is timely. In summary, Orman's mechanisms are as follows:
1) Substitution: As technologies substitute "cheap for expensive," "complex for simple," and "formal for informal," a large quantity of information drives out high-quality information.
2) Obsolescence: Changes in technologies often require organizational adaptations and specialization. Consequently, old but useful information, methods, and practices get lost.
3) Competition: Information overload makes information a competitive weapon. Social actors competing for limited resources might mislead each other through deliberate misinformation.

Although it is not fully clear how the three mechanisms fit together and where the boundary of their explanatory power is drawn, Orman does a great job in illustrating them individually. He also makes the problem of information overload appear manageable and solvable. However, when it comes to solutions, Orman leaves us with some contradictions, and sidesteps some existing solutions as well. We will examine these solutions and propose to consider the interaction of the three mechanisms.

First, he prescribes "liberalism" and "protectionism" at the same time without addressing why apparently inefficient organizations may monopolize social lives. According to him, we need organizations to compete so we do not end up with quick, irreversible substitutions; whereas, to prevent obsolescence, we need to practice "cultural protectionism." He argues that organizations such as state, family, and church are monopolies with breeding grounds for irreversible substitutions. Even if these organizations adopt inefficient practices, the practices become almost impossible to change. So, we need competition, such that practices come about through small-scale experimentations. While Orman's illustrations are appealing, he does not note why such organizations survive despite being inefficient monopolies. The causes are, of course, multifaceted, and have been subject to debate. For instance, in a recent provocative manuscript, Why Nations Fail, Acemoglu and Robinson show that states are inefficient because extractive political institutions in them allow some people an unequal opportunity to usurp resources and power [2]. So, if Orman is to prevent states or nations from adopting inefficient, irreversible practices, he should address the causes (e.g., Acemoglu and Robinson's extractive political institutions).

Second, Orman suggests that use of "trust partners" can save people from misinformation, but does not elaborate on the downside of such

Digital Object Identifier 10.1109/MTS.2018.2804981
Date of publication: 2 March 2018
Rui Costa
The Internet of Moving Things

Mobile technology isn't just in your pocket 24/7. It's everywhere around us today, with its continual byproduct — data — trailing us everywhere we go. The great nexus of this 21st-century trend isn't really your smartphone — it's the city where you live, work, and play. Over half of the world's population lives in urban areas, which are expected to grow to accommodate an additional 2.5 billion people [1] over the next three decades.

While critics argue that 24/7 devotion to our devices can drive us apart, others point to how myriad streams of emerging data culled from every moving (and connected) thing pulsating through our cities — cars, buses, bicycles and more — can ultimately reshape and optimize our urban areas to make them better places to live.

That presents urban leaders today with enormous challenges, but also big opportunities to tap into the mobile-technology boom to
Yoshiyasu Takefuji
Connected Vehicle Security Vulnerabilities

In the history of mandatory regulation of computerized vehicles, an E-Letter entitled "Black box is not safe at all" was published in Science [1] in 2017. It mentioned that on-board diagnostics (OBD-II) specifications were made mandatory for all cars sold in the United States in 1996. The European Union made European OBD (EOBD) mandatory for all gasoline (petrol) vehicles sold in the European Union starting in 2001.

The problem is that the OBD-II and EOBD specifications contain "black boxes" that cannot be fully tested by car manufacturers. There is also no security provided in the OBD-II and EOBD specifications. In other words, for more than fifteen years, we have been neglecting the security problems of naked (unsecured) cars [1].

Before considering autonomous vehicles [2], we must understand such unsecure mandatory specifications. Why have we been forced to live with black-box testing without understand-
Ethical Implications

From […] to virtual assistants, industrial machines to drones, today's robot creations, including those in automation, are used to perform any number of specific tasks. These undertakings are repetitive in nature and suggest that we are still a long way from manufacturing "all-purpose" utility robots.

History teaches us much about technological innovation and the perils of over-promising. We are, for example, still a far cry away from the headlines of the first computers that promised so much but delivered only computational trajectories for the military. In fact, seventy years later, we will see headlines similar
■ Reviewing submissions to IEEE ISTAS, Norbert Wiener, IEEE Ethics, IST-Africa Week, and other SSIT supported conferences.
■ Supporting activities of the IEEE SSIT IST-Africa SIGHT in IST-Africa Partner Countries.
■ Representing SSIT on IEEE committees (TAB, BoD, Standards, Future Directions Initiative).
■ Serving on the SSIT Board of Governors.
If any of these opportunities are of potential interest or if you would like to recommend someone, please contact me (Subject: Volunteer for IEEE SSIT — <name>) and I will direct you to the responsible team. If you have not received a response to a previous offer to volunteer, please accept my sincere apologies and contact me again so I can assist you.

Call for Donations, Gifts, and Bequests
SSIT is launching a fundraising campaign focused on securing the level of resources required to scale activities over the coming years. Funds will be invested in further strengthening and expanding volunteer activities. Options to financially support SSIT volunteer activities include:
■ Donate to SSIT online https://ieeefoundation.org/ieee_ssit.
■ Mail a check payable to the "IEEE Foundation — SSIT Fund" to: IEEE Foundation, 445 Hoes Lane, Piscataway, NJ 08854, U.S.A.
■ Asking your employer to match your personal donation.
■ Donate in honor or memory of someone who has touched your life or others.
■ Direct a gift to the "IEEE Foundation — SSIT Fund" from your donor advised fund, foundation or family office.
■ Remember SSIT in your will.

Thanks and Welcome
I would like to acknowledge the enormous contribution made by Subrata Saha of SUNY Downstate Medical Center, Brooklyn, NY, and John Villasenor, Professor of Electrical Engineering at UCLA, who have finished their terms of service on the SSIT BoG. I welcome Charmayne Hughes of the Health Equity Institute and an Associate Professor at San Francisco State University; Heather Love, Assistant Professor of English at the University of South Dakota; and Jay Pearlman, currently adjunct Professor at the University of Colorado, who have all been elected to serve three-year terms on the SSIT Board of Governors beginning in 2018.

Author Information
Paul M. Cunningham, 2017–2018 IEEE-SSIT President, is President & CEO, IIMC (Ireland); Director, IST-Africa Institute (www.IST-Africa.org); Adjunct/Visiting Professor, International University of Management (Namibia); and Visiting Senior Fellow, Wrexham Glyndŵr University (Wales). Paul is 2018 Chair, IEEE Humanitarian Activities Committee and serves on the IEEE Global Public Policy Committee. Email: pcunningham@ieee.org.
Humanizing
Human-Robot
Interaction
Alessandra Sciutti, Martina Mara, Vincenzo Tagliasco, and Giulio Sandini

Digital Object Identifier 10.1109/MTS.2018.2795095
Date of publication: 2 March 2018

In conjunction with what is often called the industry 4.0, the new machine age, or the rise of the robots, the authors of this paper have each experienced the following phenomenon. At public events and round-table discussions, among our circles of friends, or during interviews with the media, we are asked on a surprisingly regular basis: "How must humankind adapt to the imminent process of technological change? What do we have to learn in order to keep pace with the smart new machines? What new skills do we need to understand the robots?"

We think that these questions are being posed from the wrong point of view. It is not that we, the ever growing number of robot users, should be the ones who need to acquire new competencies. On the contrary, we want to ask how the robots that will soon be popping up all over the place can adjust to their human interaction partners in better ways. What do the robots have to learn to be considerate of people and, no less important, be perceived as considerate by people? Which skills do they need, what do they have to learn to make cooperation with humans possible and comfortable?

Coming from various disciplinary backgrounds rooted in robotics, cognitive science, psychology, and communication, these are the shared questions on which we have based our approach to humanize human-robot interaction (HRI). It is an approach that ultimately leads us to the necessity of mutual understanding between humans and machines — and therefore to a new design paradigm in which collaborative machines not only must be able to anticipate their human partner's goals but at the same time enable the human partner to anticipate their own goals as well.

We will be elaborating on several important design factors in each respective area. Even if they don't constitute an all-encompassing concept, we are convinced that they build a solid basis and an effective strategy for the development of humane robots. (We adopt here the Cambridge Dictionary definition of humane: "showing kindness, care, and sympathy towards others, especially those who are suffering.") Moreover, we think that robots that are designed for mutual understanding can also make a positive impact on the subjective psychological experience of human-robot interactions and enhance public acceptance of robotic technologies in general.

People's Fears of Robots
At present there is still much skepticism on the part of some groups of potential users towards the increasing deployment of robots in domestic environments and, exceedingly, in workplaces. According to a recent large-scale survey in the European Union, approximately 70 percent of people think that robots will steal people's jobs and around 90 percent say that the implementation of robots in society needs careful management [1]. In relation to these numbers, a much smaller but still sizeable population could be called "technophobes" or "robophobes" [2], defined as individuals who are anxious towards smart machines on a personal level. The Chapman University Survey of American Fears [3] revealed in this regard that 29 percent of U.S. residents reported to be very afraid or afraid of robots replacing workforce, a number comparable to the occurrence of the fear of public speaking in the U.S. population. Furthermore, 22 percent of participants indicated being very afraid or afraid of artificial intelligence, and 19 percent of "technology I don't understand" [3]. The imagined substitution of human beings by intelligent artificial agents has been repeatedly described as a strong fear, reaching from the fear of job loss relevant to everyday life [2], [4] to much vaguer fears of an artificial "superintelligence" [38] that on its own develops doubtful intentions, and, ultimately, a "robocalyptic" end of humankind [5]. Science fiction, of course, plays a role here. While some fictional stories have been shown to generate meaning and thereby increase recipients' acceptance of robotic technology [35], [37], many highly popular movies such as The Terminator, Blade Runner, or Ex Machina [38] circulate dystopian outlooks and frequently encourage the audience to envision a militarized future of human-robot relations [39].

Coming back to more contemporary, non-fictional developments in robotics, various fears and ethical concerns have been raised in view of so-called social or "emotional" robots, meant to be used, e.g., for the care of children or the elderly. A number of scholars and study participants have expressed their worries about such robotic companions as they might contribute to social isolation by reducing the amount of time spent with other humans, lead to a loss of privacy and liberty or to emotional manipulation of lonely, sensitive persons [38], [40], [41], [43].

Besides being afraid of robotic surrogates or caregiver robots, however, there are other types of fears about robots that have been noted as relevant in the literature.
the (powerful) robot's shoulders and not only on those of the human. As a result, experiencing mutual adaptation during the interaction will make the robot behavior much more predictable and acceptable, addressing many of the fears caused by current uncertainties about these machines.

If more humane interactions are established, it will become more and more evident that robots, rather than replacing us, might support humans, performing tasks we don't like. Beyond replacing our household appliances, as already some robotic vacuum cleaners or lawn mowers do, robots might be assigned to progressively more complex and relevant duties, such as providing support to the elderly, in order to allow them a longer period of autonomous living in their home. A humane robot won't replace human contact, but will provide concrete support in coping with physically demanding tasks that a person cannot perform alone anymore.

At the same time, this can facilitate interaction with peers. For instance, use of robots may be able to help mediate the access of seniors to novel digital communication channels, making the interaction with the devices intuitive. Already current robotic platforms presented as "personal robots" promise to move in this direction, by autonomously dealing with technical aspects of a video call and making the call process transparent to the users.

Robots might also provide support to human therapists, since there is evidence suggesting that use of robots can bring social benefits to clinical populations. For instance, in the case of autism or dementia, it has been shown that robots can facilitate group dynamics, by increasing the occasions of interaction between patients, and leading to an increment in social exchanges between patients and the therapists [32], [33].

The task of humanizing human and robot interactions is challenging, however, because robots are currently not as good as humans at adapting to their partner's needs. There are various examples of humans learning to predict non-humane machines, although with some effort, e.g., think of workers dealing with complex technical devices. To provide robots […]

Robots, and in particular humanoid robots, can play an important role in this effort, as they are a valuable tool for investigating controllable, repetitive dynamics of human interactions, to derive and validate models of human social behavior [34].

We posit that the design of humane robots will bring concrete advantages to society, and that they will change the common perceptions of robots. The more people know about robots, the less they fear them [2], and mutual understanding between human and robots increases the predictability and legibility of the machines, fostering a more relaxed and natural coexistence.

Therefore, humanizing human-robot interactions will be decisive in determining whether people will accept robots in their societies, and how close we will be to a future in which humankind and robot-kind can co-exist in safe and peaceful ways.

Acknowledgment
This work was written in the framework of the European Project CODEFROR (FP7-PIRSES-2013-612555).

Author Information
Alessandra Sciutti is the head of the Cognitive Robotics and Interaction Laboratory of the Robotics, Brain and Cognitive Sciences Department of the Italian Institute of Technology (IIT) in Genoa, Italy. Email: alessandra.sciutti@iit.it.
Martina Mara is the head of the RoboPsychology research department at the Ars Electronica Futurelab in Linz, Austria. She is also a member of the Austrian Council for Robotics and a newspaper columnist.
Vincenzo Tagliasco is founder of the Bioengineering Group at the "Istituto di Elettrotecnica" of the University of Genova. He is the initiator of the anthropomorphic robotics activities described in this article. He was the first Director of the Department of Communication Computer and System's Science of the University of Genova.
Giulio Sandini is full professor of Bioengineering at the University of Genoa and Director of Research at the Italian Institute of Technology where he leads the Robotics, Brain and Cognitive Sciences Department.

References
[1] European Commission, "Special Eurobarometer 427: Autonomous Systems," 2015. [Online]. Available: http://ec.europa.eu.
[2] P. K. McClure, "'You're Fired,' Says the Robot," Soc. Sci. Comput. Rev., 2017.
[3] Chapman University, "America's Top Fears," 2015. [Online]. Available: https://blogs.chapman.edu/wilkinson/2015/10/13/americas-top-fears-2015/.
Kathleen Richardson, Mark Coeckelbergh, Kutoma Wakunuma, Erik Billing, Tom Ziemke, Pablo Gómez, Bram Vanderborght, and Tony Belpaeme

Digital Object Identifier 10.1109/MTS.2018.2795096
Date of publication: 2 March 2018

The development of social robots for children with autism has been a growth field for the past 15 years. This article reviews studies in robots and autism as a neurodevelopmental disorder that impacts social-communication development, and the ways social robots could help children with autism develop social skills. Drawing on ethics research from the EU-funded Development of Robot-Enhanced Therapy for Children with Autism (DREAM) project (Framework 7), this paper explores how ethics evolved and developed in this European project.

The ethics research is based on the incorporation of multiple stakeholders' perspectives including autism advocacy; parents of children with autism; medical practitioners in the field; and adults with Asperger's disorder. Ethically, we propose that we start from the position that the child with autism is a social being with difficulties in expressing this sociality. Following from this core assumption, we explore how social robots can help children with autism develop social skills. We challenge the view that children with autism prefer technologies over other kinds of activities (exploring nature or the arts), engagement with other living beings (animals),
Philosophical and
Social Dimensions
Clark Glymour argued in 2004

[Figure: ontology diagram relating the abstract "Model of Yeast Biochemical Metabolism" to physical entities (Nutrient, Gene, Plate, Computer Software, and Computer Hardware) through "Denotes," "is-a," "Part-of," and "Manipulates" relations.]
associated with different ontologies: correspondence theories with realism, and pragmatism, verification, and coherence with idealism, anti-realism, or relativism [14].

A robot scientist's physical effectors (laboratory robots) can test the truth or falsehood of an abstract scientific proposition by specific physical experiments: an Abstract entity of type Proposition is assigned a truth value by a Physical entity that participates in a specified Process. This is achieved through the designed isomorphism between an abstract Denotation rule and a physical Denotation process (Figure 2). This operational approach to truth does not discriminate between correspondence, pragmatism, verification, or coherence theories of truth. For a human scientist these different approaches may possibly inspire different ways of doing science, but given the current state of development of robot scientists it is unclear to us whether there is any operational difference between these approaches.

One of the most debated questions in the philosophy of science is that between realism and anti-realism. Realism is "the viewpoint that accords to the objects of knowledge an existence that is independent of whether anyone is perceiving or thinking about them" [14]. The alternative position regards the existence of the real world as a metaphysical question that cannot be answered, and regards scientific theories as instruments of prediction [15], [16]. As physical devices, robot scientists necessarily adopt a realist position as defined above. However, their approach to determining the truth of propositions is also consistent with that of anti-realism. Therefore, with robot scientists there would seem to be no difference in regarding scientific theories as descriptions of reality or as tools for prediction. This approach is related to quietism [17].

The realism/anti-realism debate is closely connected to another area of interest in the philosophy of science that is important in the design of robot scientists: the relationship between observed and theoretical entities. This subject has long been a matter of debate in the philosophy of science, with some philosophers claiming that the distinction is not real and/or important [16]. We argue that the distinction clarifies the robot scientist's reasoning, and that what are observed and theoretical entities is relative to defined instrumentation.

A common view in the philosophy of science is that hypothesis formation necessarily requires human creativity [10]. This view has long been challenged by AI [e.g., 19]. Most work within scientific discovery has
Sean F. Johnston

In 1966, a well-connected engineer posed a provocative question: will technology solve all our social problems? He seemed to imply that it would, and soon. Even more contentiously, he hinted that engineers could eventually supplant social scientists — and perhaps even policy-makers, lawmakers, and religious leaders — as the best trouble-shooters and problem-solvers for society [1].¹

The engineer was the Director of Tennessee's Oak Ridge National Laboratory, Dr. Alvin Weinberg. As an active networker, essayist, and contributor to government committees on science

FIGURE 1. Engineers and scientists as social problem-solvers [source: New York Herald Tribune, 7 Aug. 1945 (the day after Hiroshima), p. 22].

¹ Weinberg's second speech on the topic was more cautiously titled, and was reprinted in numerous journals and magazines and widely anthologized in university texts [2].

Digital Object Identifier 10.1109/MTS.2018.2795118
Date of publication: 2 March 2018
The Voices of Technocracy
Journalists after the First World War christened modern culture "the Machine Age," a period that vaunted the mechanization of cities and agriculture, industrial efficiency, "scientific management," and most of all, engineering solutions to modern problems [6], [7]. Social progress became associated with applied

FIGURE 2. Modern problem-solving rhetoric: usage of the terms A — "technological solution," B — "technological fix," and C — "technical fix," according to Google n-gram analysis (1920–2000).
FIGURE 5. Alvin Weinberg teaching at the Oak Ridge Institute for Nuclear Studies, 1946. Courtesy of Oak Ridge National Laboratory (ORNL).

Engineering and Applied Physics at Harvard. Brooks, too, had participated in nuclear reactor design and had an interest in applying scientific expertise for societal benefit [18]. In an era of growing technological confidence, these hopeful analysts and their peers offered a rational route for societal improvement.

Weinberg's Formulation: National Labs for Societal Problems
Alvin Weinberg's optimism identified rational analysis and technological innovation as the key drivers of societal progress. He argued that it was "the brilliant advances in the technology of energy, of mass production, and of automation," not social systems or ideologies, that "created the affluent society" [19].

Weinberg promoted the belief that technological innovation could resolve any social issue as an article of faith.

Weinberg (1915–2006, Figures 5 and 6) focused his postwar career on the design, applications, and wider implications of nuclear reactors, becoming Director of the Oak Ridge National Laboratory (ORNL) in 1955. His high-profile position allowed Weinberg to represent not just the nascent field of nuclear engineering, but also the closer integration of technological innovation with the goals of modern American society [20]. His networking provided him with experience as a senior administrator in the new environment of publicly funded

So closely was he identified with the concept that Weinberg later characterized his career as that of a "technological fixer" [22]. (On the gestation of his ideas see [23].)

Weinberg's cogent articles did not present the polemics of an interwar technocrat. He was cautious not to reveal his own political views, and avoided blaming politicians and economists for societal imperfections. Instead, Weinberg packaged the concept of the technological fix in a form that invited responses from policy-makers.

Weinberg's examples of technological fixes ranged from common-sense solutions to provocative examples that seemed to lie on an ethically slippery slope. His easy-to-accept cases included consumer campaigner Ralph Nader's contention that engineering safer cars might provide quicker reduction of traffic deaths than trying to change driving behaviors. Similarly, he argued that cigarette filters were obviously better than legislation or health education campaigns to convince smokers to give up cigarettes. But Weinberg also offered more uncomfortable illustrations, for example the notion of providing free air conditioners to literally cool down urban tensions in American cities of the late 1960s, or the benefits of intrauterine devices (IUDs) to limit family size and economic deprivation [24].

As a member of government policy panels during the Eisenhower, Kennedy, and Johnson administrations, Weinberg gained the ears of legislators. Besides the air-conditioning of slums, he lobbied for a wall between North and South Vietnam to limit enemy incursions and thus scale down the war, although he quickly labeled it an "amateurish notion" after feedback from his peers [25], [26].² Weinberg disclaimed other ideas — notably the general provision of soma pills to relieve unhappiness, as portrayed in Aldous Huxley's Brave New World — to suggest there were limits to how far technological fixes should go. He adapted to

² As Weinberg realized, his Vietnam wall — like Hadrian's Wall across northern Britain, the Great Wall of China, the Berlin Wall, and Donald Trump's proposed Mexican wall — is a technological fix for controlling population movements.
…communities that opt for technological fixes.

…technological quick fixes were proposed as timely and reassuring solutions. Current options include oil-digesting microbes to deal with spills and industrial waste, biodegradable packaging, biotechnologies for fuel production, and schemes for addressing anthropogenic climate change via geo-engineering [35]–[37].

A second domain of problems attracting technology-dominated responses is terrorism. As airplane hijackings proliferated during the early 1970s, and more varied threats were identified after 2000, technologists responded with imaginative solutions ranging from low-tech lockable cockpit doors, to technologies monitoring Internet communications, to materials-detecting and body-scanning systems. In the tradition of technological fixes, these hardware solutions are rapid responses to events that have relatively complex social, political, or economic roots.⁸

Quandaries and Implications of Technological Fixes
Such examples suggest support for the notion of technological fixes by large companies, governments, and the general population, as much as by engineers themselves [39]. But alongside unreflective acceptance of clever technological solutions for urgent problems, there is evidence of growing societal concerns about some aspects of technological fixes. Such concerns deserve to refocus the discussion begun by Weinberg fifty years ago.

Critical assessments of technological fixes have variously identified reliance on technological solutions as evidence for inadequate engineering practice, failures of government policy, or outcomes of modern consumerism. These concerns suggest that technological fixes have important implications for shared social values, the wellbeing of wider publics, and the social role of engineers. In short, technological fixes have cultural, ethical, and political dimensions.

…as the 1960s, opponents of the Vietnam War cited the impotence of high-technology military systems against the guerilla methods of a resourceful enemy [40]. If high technology can be negated by such social and political opposition, this seemed to suggest, why should technological fixes be trusted as a panacea for social and political problems?

For urban audiences over the same period, nuclear technologies were increasingly cited as inherently dangerous. For growing numbers, the field represented a failure of government-managed safety certification procedures and a secretive industry. Similarly, the chemical industry, which had once been praised for technological fixes such as DDT to kill agricultural pests and assure high crop yields, was now criticized as the source of widespread ecological damage [41]. Technological criticism in America pointed to catastrophes such as super-tanker spills⁹ as representative of decision-making that prioritized the global petrochemical economy. And while human health remained the domain of technological fixes evincing the most widespread optimism, some topics raised growing disquiet among consumers. Among them was an entirely new field for technological fixes: genetic engineering to design foods that could be longer-lasting or more nutritious (but not necessarily tastier), or to cure inherited illnesses or extend human choices (but also introducing myriad moral questions alongside these new powers). Such cases were cited to argue that technological solutions streamlined analysis, prioritized economic, corporate, or consumer interests rather than wider benefits, and under-estimated societal side-effects.

Ethical Implications
Early scholarly assessments of Alvin Weinberg's notions criticized them as naively confident about the outcomes of science ("scientistic") and tending to narrowly define the complexity of problems ("reductionistic") [42]. Because of its exaggerated attention to measurable outcomes, rational decision-making carries additional philosophical and ethical dimensions. This positivist confidence prioritizes quantitative evidence, and necessarily devotes less consideration to aspects of human values that cannot be counted.

⁸ Engineering disciplines have adapted to the contemporary environment of terrorist threats by creating special-interest groups to promote security technologies and funding for technological fixes. Among them is the Homeland Security group of SPIE, the optical engineering society, which aims to "stimulate and focus the optics and photonics technology community's contributions to enhance the safety, counter homeland threats, and improve the sense of well being" [38].

⁹ International incidents included spillages from the oil tankers Amoco Cadiz (1978) and Atlantic Empress (1979). Later incidents, such as the Exxon Valdez (1989) and Deep Water Horizon (2010), fueled public debate about societal reliance on large-scale technological systems, ironically while promoting technological fixes for avoiding or cleaning up after such accidents.
Ariel Guersenzvaig

In this article, I explore the ethical permissibility of autonomous weapon systems (AWSs), also colloquially known as killer robots: robotic weapons systems that are able to identify and engage a target without human intervention. I introduce the subject, highlight key technical issues, and provide necessary definitions and clarifications in order to limit the scope of the discussion. I argue for a (preemptive) ban on AWSs anchored in just war theory and International Humanitarian Law (IHL), which are both briefly introduced below.

To make my case, I examine and juxtapose a series of arguments and counterarguments in favor of and against AWSs made by several authors from the literature, especially Sharkey [1], and Schmitt and Turner [2]. I will

Digital Object Identifier 10.1109/MTS.2018.2795119
Date of publication: 2 March 2018
…scientific proxy for an economic conflict between professional drivers and robotics manufacturers who are trying to replace them.

…of 1.09 fatalities per 100 million [vehicle] miles [traveled]." Precise estimates of such rare events require large samples. In the case of human-driven vehicles, these large samples come from trillions of vehicle miles traveled per year, distributed over tens of millions of cars.

Kalra and Paddock [7] calculate the number of AV miles traveled required to produce statistically precise estimates of AV crash rates across several different crash types (from any reported crash to fatalities), degrees of statistical stringency or precision, and statistical tasks (estimating the maximum crash rate, demonstrating that the crash rate is lower than a threshold, and demonstrating that the crash rate is statistically significantly lower than the human crash rate). Their calculations range across 5 orders of magnitude, from 1.6 million to 11 billion miles (the latter required to detect a 20% improvement over the human fatality rate of 1.09 fatalities per 100 million miles). A fleet of 1000 AVs, driving an average of 6 hours per day at an average of 60 mph, would require 84 years to travel 11 billion miles.

Uncertainty due to the large amount of data required is reflected in other analyses. A Google-sponsored report used data from a National Academies study and Google to compare human-driven and automated vehicle crash rates [8]. The 95% confidence intervals for the AV crash rate estimates are several times wider than those for human-driven vehicles, and for the two highest-severity crash levels, the AV confidence intervals fully contain the human-driven confidence intervals [8, fig. 3, p. 23]. There is simply insufficient observational evidence to conclude that AVs are statistically significantly safer than human-driven cars at conventional levels of statistical stringency. And, per [7], this evidence will remain insufficient for decades to come.

Alternatives to Observational Data
If we can't use observational data, then simulations, physical test systems, and other broadly "laboratory-based" or "experimental" methods seem to offer a way forward. For example, self-driving software might be run offline several orders of magnitude faster than real-time driving, and thereby generate billions of "virtual miles traveled" on a scale of years — or perhaps even days —
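The arithmetic behind these mileage requirements can be sketched with a simple model. The snippet below is an illustrative reconstruction, not Kalra and Paddock's actual analysis: it assumes fatalities follow a Poisson process, and the function names and fleet parameters (taken from the example in the text) are ours.

```python
import math

HUMAN_FATALITY_RATE = 1.09 / 100e6  # fatalities per mile (1.09 per 100M miles)

def miles_to_bound_rate(rate_bound, confidence=0.95):
    """Failure-free miles needed to claim, at the given confidence, that the
    true fatality rate lies below rate_bound, assuming fatalities are
    Poisson-distributed: P(0 fatalities in n miles) = exp(-rate * n)."""
    return -math.log(1.0 - confidence) / rate_bound

def fleet_years(miles, n_cars=1000, hours_per_day=6, mph=60):
    """Years a fleet needs to accumulate the given mileage."""
    return miles / (n_cars * hours_per_day * mph * 365)

# Merely bounding the AV rate at the human level takes hundreds of millions
# of failure-free miles; showing a 20% improvement takes billions.
print(round(miles_to_bound_rate(HUMAN_FATALITY_RATE) / 1e6))  # -> 275 (million miles)
print(round(fleet_years(11e9)))  # -> 84, matching the 84-year figure above
```

Even this optimistic zero-fatality case already exceeds any plausible test-fleet mileage, which is the core of the argument above.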
FIGURE 1. Crash rate estimates for human and AV system drivers. Bars give crash rate estimates (per million miles) with whiskers indicating 95% confidence intervals. Blue and orange bars are different classes of estimates for human drivers (SHRP 2 Overall, SHRP 2 PR, and their age-adjusted variants); green bars are estimates for AVs (Self-Driving Car). Level 1 crash severity is the most severe and level 3 is the least severe. Note that, for levels 1 and 2, the confidence intervals for humans are entirely contained within the confidence intervals for AVs. This is due to limited data for AVs [8, Fig. 3, p. 23].

…suggests another testing strategy. In May 2016, a Tesla driver was killed when his car, operating in "Autopilot" mode, failed to recognize a truck. In response, Tesla proposed a "fleet learning" strategy: a Tesla car's AV system will track its human drivers' behavior while the AV system is "off," comparing the human behavior to the behavior that it would take based on its sensor data. Pooling these data from all Tesla vehicles will enable the identification of locations and situations where the AV system tends to be inaccurate and needs improvement [9].

This fleet learning proposal could also be used for safety studies. For example, fleet learning data could be filtered for three kinds of scenarios: a crash occurred under AV system control; a crash occurred under human control; and a human reclaimed control from the AV system and immediately began rapidly braking or changing direction. The first kind of scenario gives us direct data on AV crash rates. The second and third scenarios let us compare human behavior with counterfactual AV system behavior. In the third scenario,
The Phantom Menace (1999)

Singularity — or, as some may call it, the specific point in time when advanced technological development, for instance artificial intelligence (AI), leads to the creation of machines that are smarter than human beings — will arrive roughly around the year 2045 according to author, computer scientist, inventor, and futurist Ray Kurzweil [1]. The timeline presented by Kurzweil can be disputed, but not the fact that we are dealing with rapid advances in the fields of robotics and AI.

This article, as part of our early research, is a first step towards establishing an internationally viable, neutral, consistent legal nomenclature for a specific type of technology and its use. Due to the complexity of interaction between different legal systems and cultures, our early research will focus on soft law. Soft law includes non-enforceable guidelines, policy declarations, or codes of conduct that set standards, often established by treaty. Many of these are Free Trade Agreements, where a lack of technical or legal definitions is a barrier

…about the implications, long-term, of the current and future merging of robotics and Artificial Intelligence (AI) into an integrated, autonomous entity.¹ It is also to address the legal aspects of daily, continuous interaction between humans and this new entity.

We propose the basic questions underpinning this article to be part of a platform for further discussions on the future legal aspects of human-Robotics/AI Legal Entity (RAiLE©) interaction in key areas of our lives. Initially we look into aspects affecting the workplace and family.

Today we see increased coverage in the global press about the fascinating developments in the areas of robotics and AI, not least in the areas of autonomous vehicles and military applications. An important component of the concern presented is the very important, and to some quite frightening, situation when robotics and AI are combined into one entity. Most of these articles, however, examine specific academic, ethical, technical, economic, political, social, or legal impacts, focusing only on one or two areas of perceived technological disruption [2] brought into human society through advanced development of robotics, with or without AI. We strongly argue that it is, in particular, the more or less autonomous, combined robotics/AI entity that will cause legal issues in the future.
But what about legal disruption when legislation is not
yet enacted to address the societal impact of new tech-
nology? Facing this issue, we combine our global busi-
ness and legal experience with an experience-based
understanding of information technology (IT) develop-
ment, and see a growing need to look at current legisla-
tion governing the future interaction between humans
and both current and future autonomous robotics/AI enti-
ties. We see today a lack of consistent legal definitions
and related legislation to adequately handle these future
entities, entities that are not created to fulfil only one spe-
cific role but are capable of multiple different, more inte-
grated, roles in our lives and society on a daily basis.
Our question is: “How can we create the necessary
definitions and parameters for a future, global, legisla-
tive framework on human-RAiLE interaction?”
Helpful Definitions
We begin by defining some important and relevant terminology.

Human-Robot Interaction (HRI): This is the relationship between human and robot (machine). The term

¹ This is not necessarily limited to a 1-on-1 relationship between AI and robotics, but could apply to 1-M, M-1, or limited M-M. Some form of limitation on the extent of the new legal entity will have to be defined, or any legislation may become applicable to the whole network of units connected.
FIGURE 1. Photograph of the water sampling UAV in the field (not shown to participants).
A Drone by Any Other Name
Projections indicate that, as an industry, unmanned aerial vehicles (UAVs, commonly known as drones) could bring more than 100 000 jobs and $80 billion in economic growth to the U.S. by 2025 [1]. However, these promising projections do not account for how various publics may perceive such technologies. Understanding public perceptions is important because the attitudes of different groups can have large effects on the trajectory of a technology, strongly facilitating or hindering technology acceptance and uptake [2].

To advance understanding of U.S. public perceptions of UAV technologies, we conducted a nationwide survey of a convenience sample of 877 Americans recruited from Amazon's pool of Mechanical Turk (MTurk) workers. In our surveys, we used short scenarios to experimentally vary UAV characteristics, the end-users of the technology, and certain communication factors (terminology and framing). This allowed us to investigate the impacts of these factors alone and in combination.
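As a rough sketch of the scenario manipulation just described, the factors can be pictured as a fully crossed design. This is illustrative only: the factor levels below are drawn from the manipulated variables reported in the article's tables, but the study's actual cell structure and assignment procedure are not specified here.

```python
import itertools

# Illustrative factor levels; the study's actual design cells may differ.
purposes = ["security", "environmental", "economic"]
end_users = ["government", "business"]
autonomy = ["autonomous", "partially autonomous", "manual"]
terms = ["drone", "UAV", "UAS", "aerial robot"]
framings = ["promotion", "prevention"]

# Fully crossing the factors enumerates every possible scenario cell;
# each respondent would read one assigned scenario.
cells = list(itertools.product(purposes, end_users, autonomy, terms, framings))
print(len(cells))  # 3 * 2 * 3 * 4 * 2 = 144
```

Crossing the factors this way is what lets the analysis separate each factor's effect "alone and in combination."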
TABLE 3. Correlations with support for UAV technology (Pearson r).

Variable | 2014 | 2015 | Both Years | %Var
Manipulated Variables
Purposes
  Security use | −0.30*** | −0.27*** | −0.29*** | 8.41%
  Environmental use | 0.24*** | 0.31*** | 0.27*** | 7.29%
  Economic use | 0.06 | −0.04 | 0.03 | 0.09%
End-User
  Business (vs. Government) | −0.11** | −0.03 | −0.08* | 0.64%
Autonomy
  Autonomous | −0.03 | 0.09 | 0.01 | 0.01%
  Manual | −0.04 | −0.07 | −0.05 | 0.25%
  Partially Autonomous | 0.07 | −0.02 | 0.04 | 0.16%
Terminology
  UAS term | −0.03 | −0.09 | −0.05 | 0.25%
  UAV term | 0.00 | 0.02 | 0.01 | 0.01%
  Aerial robot term | 0.05 | 0.01 | 0.03 | 0.09%
  Drone term | −0.02 | 0.06 | 0.01 | 0.01%
Framing
  Promotion (vs. prevention) | −0.12** | −0.03 | −0.09* | 0.81%
Measured Variables
  Female | −0.03 | −0.02 | −0.02 | 0.04%
  Age | −0.01 | −0.08 | −0.03 | 0.09%
  Ideology (Conservatism) | −0.05 | 0.07 | −0.01 | 0.01%
  Issue attitude | 0.17*** | 0.24*** | 0.20*** | 4.00%
Perceptions of end-users
  Trustworthiness | 0.50*** | 0.55*** | 0.52*** | 27.04%
  Dis-trustworthiness | −0.42*** | −0.48*** | −0.44*** | 19.36%
  Competence | 0.39*** | 0.47*** | 0.42*** | 17.64%

Notes: 2014 N = 576, 2015 N = 301. +p<0.10, *p<0.05, **p<0.01, ***p<0.001. %Var is the square of the Pearson correlation across both years and estimates the variance shared by the predictor and UAV support.
…uses, indicates support was lowest for security purposes and highest for environmental purposes.

Figure 3 illustrates the distribution of responses to questions assessing public support or resistance by UAV purpose and year. There was a relatively bi-modal distribution of support ratings for security purposes in both 2014 and 2015, indicating public polarization. Ratings of support for economic purposes were negatively skewed, resulting in more support on average than for security purposes. However, there appeared to be increasing polarization of responses in 2015 relative to 2014. That is, the percentage of those strongly resisting use of UAVs for economic purposes was greater than those expressing more moderate resistance in 2015. Finally, support ratings for use of UAVs for environmental purposes were negatively skewed in 2014, and even more negatively skewed in 2015, resulting in the highest average levels of support for environmental purposes.

Among the other variables listed in Table 3, there were weaker but significant relationships with UAV support favoring prevention-focused framing and end use by the government over private business. Among the non-experimentally varied variables, as expected, there were relatively strong effects of end-user trustworthiness and of issue attitudes relevant to the scenario assigned to the participant.

Multiple regression procedures provide a different measure of the importance of variables for predicting public support by identifying variables that account for independent variance above and beyond other variables. We first tested whether the effects of each of our variables depended on time (this is done by testing for
FIGURE 3. Distribution of rated support or resistance for the development and use of UAVs by purpose and year. Security purpose: 48% resisted and 39% supported in 2014; 51% resisted and 39% supported in 2015. Economic purpose: 28% resisted and 60% supported in 2014; 33% resisted and 54% supported in 2015. Environmental purpose: 17% resisted and 68% supported in 2014; 16% resisted and 71% supported in 2015. Note: Support was assessed by averaging two items (see Table 2) resulting in a mean between 1 and 7. Percentages of resistors and supporters sum to less than 100 because a small percentage of persons' mean scores were at exactly "4" (neutral) and thus were not counted as resistors or supporters.
statistical interactions with time). These analyses indicated that the overall main effects did not change between 2014 and 2015. We therefore ignore the effect of time in most of our remaining analyses. Next, we examined a regression model in which the experimentally varied factors were entered on Step 1 and the measured variables were entered on Step 2. This allows us to see how important each variable is when it is competing with different combinations of other variables. Note that we standardized the measured predictor variables so that they would have a mean of zero (representing the average response) and a standard deviation of 1, in order to make results easier to interpret. Table 4 shows the Step 1 and 2 models' effects, which we next discuss in relation to our research questions and hypotheses.

TABLE 4 (fragment). Step 2 (a):
Variable | B | SE | %Var
  Trustworthiness | 0.52*** | 0.063 | 4.69%
  Distrustworthiness | −0.16* | 0.064 | 0.45%
  Competence | 0.27*** | 0.059 | 1.53%
  Issue attitude | 0.12* | 0.046 | 0.46%
  Ideology | −0.01 | 0.044 | 0.00%

Response to RQ1: U.S. Public Support is Impacted (Slightly) by Framing but not by Terminology
Table 4 provides evidence supporting our hypothesis (H1) that terminology will have no impact on public support in the U.S., but framing will have a significant impact favoring prevention framing. Consistent with prior research in social psychology, prevention framing in terms of protecting people from harm was associated with slightly more support (predicting a 0.23 point increase in support on the 7-point scale) compared to promotion framing. However, the overall variance accounted for by prevention or promotion framing (beyond that accounted for by other variables) was small (independently only accounting for less than one-half-percent of the total variance in support for UAVs).

Note that, although terminology did not impact support for the technology, it did impact familiarity. A total of 92% of respondents across both years indicated "yes" they had heard of drones. Only 59% indicated they had heard of UAVs, 37% had heard of UASs, and 33% had heard of aerial robots. These results were similar across both years of the survey.

Response to RQ2: UAV Autonomy did not Impact UAV Support, Regardless of Purpose
Table 4 results indicate that, as a main effect, autonomy of the UAVs does not appear to affect public support, although there was slightly less support for fully manual UAVs than partially autonomous UAVs in Step 1 of the model. To examine whether the effect of autonomy depends on the purpose of the UAVs, we conducted another regression analysis (not shown in Table 4) that tested for the interaction between the purpose and autonomy variables. The interaction was not statistically significant, which indicates that our hypothesis (H2), that autonomy will have different effects on support depending on purpose, was not supported. Autonomy did not affect our respondents' reported levels of support for UAVs, regardless of the purpose of the UAVs.

Response to RQ3: Purpose and End-User Trustworthiness are the Most Important Predictors of Support
The results presented in Table 4 further confirm the importance of UAV purpose for impacting support, as purpose accounts for about 13% (11+2%) of the independent variance in the Step 1 model, while end user and framing each independently account for vastly less — only about
FIGURE 4. Predicted UAV support by year (2014, 2015), UAV purpose, UAV end-user, and ideology (computed at –1 and +1 standard deviation from the sample mean ideology). Predicted support values ranged from 2.90 to 5.70 across the plotted conditions. Notes: Bars representing the conditions under which there occurred significant relationships between ideology and support are labeled. (a) Ideology-support correlation but not the ideology regression coefficient was significant in 2015.

…p = 0.046), indicating that the strength of the ideology-support relationship varied by end-user, purpose, and year of the survey. Corr = Pearson correlation. %Var for Corr indicates the total variance shared by self-reported ideology and support. %Var for B indicates the independent variance accounted for based on regression results.
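Two statistics recur throughout these tables: predictors standardized to mean 0 and standard deviation 1, and %Var computed as the squared Pearson correlation. A minimal sketch of both, with helper names and toy data of our own rather than the study's:

```python
def standardize(xs):
    """Z-score a variable: mean 0, (population) standard deviation 1."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [(x - mean) / sd for x in xs]

def pearson_r(xs, ys):
    """Pearson correlation: mean product of the two z-scored variables."""
    zx, zy = standardize(xs), standardize(ys)
    return sum(a * b for a, b in zip(zx, zy)) / len(xs)

# Toy data only. %Var is the squared correlation times 100; e.g., in
# Table 3 the trustworthiness r of 0.52 yields 27.04% shared variance.
trust = [1.0, 2.0, 3.0, 4.0, 5.0]
support = [2.0, 3.5, 4.0, 5.5, 6.0]
r = pearson_r(trust, support)
pct_var = 100 * r * r
```

Standardizing the predictors in this way is what makes the regression coefficients in Table 4 comparable across variables measured on different scales.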
Doors and Disputations

Five hundred years ago, Luther nailed ninety-five theses to the door of Castle Church in Wittenberg. His scholarly objection to certain practices of the church incited profound and persisting societal change. In the 16th century, church doors were a mode of publication where academics posted propositions in Latin, thus inviting debate. Eventually, postings were no longer written in Latin, but rather in the vernacular of the people to better reach society [1], [2].

Similar to this custom of long ago, our authors nail rich scholarship to our portal, thus inviting healthy disputation. In this issue, we considered the value of a mesh of connective vehicles used to overcome the digital divide [3], yet also recognize the dangers of subscribing to the technological fix as a social cure-all. We recognized the benefits of UAVs and military robotics, yet also wrestled with tensions between autonomous weapon systems and jus in bello, thereby questioning war practices when weighed upon the scales of just and fair conduct.

Our authors presented methodologies to reform established practices. We applaud our colleagues as robots are designed to better simulate the biological and cognitive processes of humans [4]. We are inspired as robots better infer the psychological disposition of a child with autism, resulting in richer interactions that improve lives [5].

Doors are not only physical portals, but also symbols of transitions. Our community experiences transition as the torch of editorship is passed. We are so grateful for the commendable work of our past "keeper of the threshold." She gave voice to diverse stakeholders. She steered us across the thresholds of a plethora of industries to mine out intended and unintended consequences of current and emerging technologies. She guided our foci as we trekked through time to learn from the past, and to conjecture the future. Our publication has also become more applicable and accessible to a general audience. These rich efforts resulted in meaningful debate and deeper understanding of the complex interactions between technology and society across the globe.

As the torch is passed, our new "keeper of the threshold" leads us onward. Under his gifted leadership, we will continue to post rich scholarly propositions, and derive great benefit from healthy disputation.

Author Information
Christine Perakslis is Associate Professor in the MBA Program, College of Management, Johnson & Wales University, Providence, RI. Email: christine.perakslis@jwu.edu.

References
[1] E. Metaxas, Martin Luther: The Man Who Rediscovered God and Changed the World. New York, NY: Penguin, 2017.
[2] D. Jütte, The Strait Gate: Thresholds and Power in Western History. New Haven, CT: Yale Univ. Press, 2015.
[3] "A mobile network to ease your commute: Portugal's roving hotspots," Jul. 12, 2017; https://www.wired.com/brandlab/2017/07/rovinghotspots/.
[4] A. Sciutti et al., "Measuring human-robot interaction through motor resonance," Int. J. Social Robotics, vol. 4, no. 3, pp. 223–234, 2012.
[5] P. Esteban et al., "How to build a supervised autonomous system for robot-enhanced therapy for children with autism spectrum disorder," J. Behavioral Robotics, Apr. 9, 2017; http://www.dream2020.eu/wpcontent/uploads/2017/05/Paladyn-Journal-of-Behavioral-Robotics-How-to-Build-a-SupervisedAutonomous-System-for-Robot-Enhanced-Therapy-for-Children-with-Autism-Spectrum-Disorder.pdf.

Digital Object Identifier 10.1109/MTS.2018.2795122
Date of publication: 2 March 2018
Ways to Contribute
• Donate to SSIT online at https://ieeefoundation.org/ieee_ssit
• You can make a gift to SSIT in honor or memory of someone who has touched
your life
Donations to SSIT are managed by the IEEE Foundation, the philanthropic arm of IEEE. IEEE and the IEEE Founda-
tion are U.S. 501(c)3 non-profit organizations. For more information contact: donate@ieee.org or +1 732 465 5871.
www.TechnologyandSociety.org