“The Robot made me do it”: Robots encourage risk-taking behaviour in humans (University of Southampton study)

A SoftBank Robotics Pepper robot was used in the two robot conditions. Pepper, a 1.21-meter-tall, medium-sized humanoid robot with 25 degrees of freedom, is designed primarily for Human-Robot Interaction (HRI).

New research has shown that robots can encourage humans to take greater risks in a simulated gambling scenario than they would if there were nothing influencing their behaviour. Increasing our understanding of whether robots can affect risk-taking could have clear ethical, practical and policy implications, which this study set out to explore.

Dr Yaniv Hanoch, Associate Professor in Risk Management at the University of Southampton, who led the study, explained: “We know that peer pressure can lead to higher risk-taking behaviour. With the ever-increasing scale of interaction between humans and technology, both online and physically, it is crucial that we understand more about whether machines can have a similar impact.”

This new research, published in the journal Cyberpsychology, Behavior, and Social Networking, involved 180 undergraduate students taking the Balloon Analogue Risk Task (BART), a computer assessment that asks participants to press the spacebar on a keyboard to inflate a balloon displayed on the screen. With each press of the spacebar, the balloon inflates slightly and 1 penny is added to the player’s “temporary money bank”. A balloon can explode at random, in which case the player loses any money won for that balloon; alternatively, the player can “cash in” before this happens and move on to the next balloon.
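For readers unfamiliar with the task, the short Python sketch below simulates one BART-style balloon. It is only an illustration of the mechanic described above: the hidden explosion point, the maximum pump count and the 1-penny payout are assumed values for the example, not parameters reported in the study.

    import random

    PENNIES_PER_PUMP = 1   # payout added to the temporary bank per pump (illustrative)
    MAX_PUMPS = 128        # illustrative cap on the hidden explosion point

    def play_balloon(decide_to_pump):
        """Play one balloon; decide_to_pump(pumps, bank) returns True to pump again."""
        explosion_point = random.randint(1, MAX_PUMPS)  # hidden from the player
        pumps, temporary_bank = 0, 0
        while decide_to_pump(pumps, temporary_bank):
            pumps += 1
            if pumps >= explosion_point:
                return 0                        # balloon explodes: this balloon's money is lost
            temporary_bank += PENNIES_PER_PUMP
        return temporary_bank                   # player "cashes in" and keeps the money

    # Example: a cautious player who always cashes in after 20 pumps.
    earned = sum(play_balloon(lambda pumps, bank: pumps < 20) for _ in range(30))
    print(f"Earned {earned} pence over 30 balloons")

A player who pumps more earns more per balloon but also loses the banked pennies more often, which is what makes the task a useful measure of risk-taking.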

One-third of the participants took the test in a room on their own (the control group), one-third took the test alongside a robot that provided them with the instructions but was silent the rest of the time, and the final third, the experimental group, took the test with the robot providing the instructions as well as speaking encouraging statements such as “Why did you stop pumping?”

The results showed that the group who were encouraged by the robot took more risks, blowing up their balloons significantly more frequently than those in the other groups did. They also earned more money overall. There was no significant difference in the behaviour of the students accompanied by the silent robot and those with no robot.

Dr Hanoch said: “We saw participants in the control condition
scale back their risk-taking behaviour following a balloon
explosion, whereas those in the experimental condition continued to
take as much risk as before. So, receiving direct encouragement
from a risk-promoting robot seemed to override participants’
direct experiences and instincts.”

The researchers now believe that further studies are needed to see whether similar results would emerge from human interaction with other artificial intelligence (AI) systems, such as digital assistants or on-screen avatars.

Dr Hanoch concluded: “With the widespread adoption of AI technology and its interactions with humans, this is an area that needs urgent attention from the research community.”

“On the one hand, our results might raise alarms about the prospect of robots causing harm by increasing risky behavior. On the other hand, our data points to the possibility of using robots, and AI, in preventive programs such as anti-smoking campaigns in schools, and with hard-to-reach populations, such as addicts.”

Originally published by the University of Southampton | December 11, 2020

 

