A new study has found that one of the challenges in designing systems in which people interact with technology is tackling the human trait of overconfidence.
The study, published in the journal IEEE Control Systems, takes a novel multidisciplinary approach to studying “cyberphysical human systems”, considering the relationship between people and computer systems from the perspectives of both control systems engineering and behavioural economics.
The research, by QUT cyberphysical systems expert Professor Daniel Quevedo together with Marius Protte and Professor René Fahr, both from Paderborn University in Germany, looks at the impact that human decisions can have on an engineered system.
Professor Quevedo said control system engineers generally did not examine the interaction between people and the systems they were part of, or how human choices could affect those systems.
To explain how unpredictable human decisions can affect a controlled system, Professor Quevedo gave the example of planning a drive with a navigation system that offers alternative routes.
“I make my own decision based on the information and drive. And that affects the whole traffic system,” Professor Quevedo said.
“There is this problem of what information the car system gives me so that I behave in one way or another.
“That’s just for one car. With traffic, there are many cars. What information should we get so that we behave in one way or another? How do our actions work?”
While the system’s designer expects drivers to take the fastest route, they might choose a different one. If enough people take an alternative route, the system’s traffic flow predictions must be revised.
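To make this concrete, here is a minimal sketch (not taken from the study; the two-route network, driver numbers, deviation rate and BPR-style congestion formula are all illustrative assumptions) of how drivers ignoring a recommendation can invalidate the system’s travel time forecasts:

```python
# Illustrative only: how driver deviations from a navigation system's
# recommended route can invalidate its traffic flow predictions.

def travel_time(flow, free_time, capacity):
    """BPR-style congestion model: travel time grows with flow/capacity."""
    return free_time * (1 + 0.15 * (flow / capacity) ** 4)

N = 1000                    # drivers entering the network (hypothetical)
recommended_share = 0.7     # system routes 70% of drivers to route A

# System's prediction: everyone routed to A actually takes A.
predicted_a = N * recommended_share
predicted_b = N - predicted_a

# Reality: suppose 20% of drivers routed to A choose route B instead.
deviation = 0.2
actual_a = predicted_a * (1 - deviation)
actual_b = N - actual_a

for label, fa, fb in [("predicted", predicted_a, predicted_b),
                      ("actual   ", actual_a, actual_b)]:
    ta = travel_time(fa, free_time=10.0, capacity=600)
    tb = travel_time(fb, free_time=12.0, capacity=500)
    print(f"{label}: route A {ta:.1f} min, route B {tb:.1f} min")
```

In this toy setup, once a fraction of drivers defect from the recommended route, the actual travel times on both roads diverge from the forecast, which is exactly the kind of mismatch a human-in-the-loop design has to anticipate.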
Professor Quevedo said successful design of “human-in-the-loop” control systems required an understanding of how humans behaved.
He said an interesting issue was that people, unlike machines, did not necessarily improve their performance through immediate and frequent feedback.
“Given the immense complexity of human behaviour, there’s no clear way to create appropriate models for human decision making,” Professor Quevedo said.
In the study, the researchers looked at how people behaved when given the task of piloting a drone, and found that frequent feedback about the quality of their piloting decisions could lead to poorer performance.
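The mechanism can be illustrated with a toy simulation (our own sketch, not the paper’s drone experiment; the noise level, drift and feedback schedules are assumed): a pilot who fully trusts every noisy reading ends up chasing measurement noise, while one who acts on a short-run average tracks the target more closely.

```python
# Toy simulation (not the study's experiment): a pilot holds a drone at a
# target altitude using noisy error readings. Correcting on every reading
# chases noise; correcting on an averaged reading tracks better.
import random

random.seed(1)

def fly(feedback_every, steps=1000, noise=1.0, drift=-0.05):
    """Mean squared tracking error when the pilot corrects every
    `feedback_every` steps using the average of readings since then."""
    target, altitude = 10.0, 10.0
    readings, sq_err = [], 0.0
    for t in range(steps):
        altitude += drift                        # slow sag, e.g. battery droop
        readings.append(target - altitude + random.gauss(0.0, noise))
        if (t + 1) % feedback_every == 0:
            altitude += sum(readings) / len(readings)   # apply correction
            readings.clear()
        sq_err += (target - altitude) ** 2
    return sq_err / steps

print("correct on every reading :", round(fly(1), 2))  # chases the noise
print("correct on 5-step average:", round(fly(5), 2))  # smoother, lower error
```

Under these assumed numbers, the pilot who reacts to every single reading ends up with several times the tracking error of the pilot who waits and averages, even though the first pilot receives five times as much feedback.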
“While more information is commonly considered to result in better decisions, human susceptibility to perceptual biases in response to high information supply must be considered,” Professor Quevedo said.
“Otherwise, individuals might take unnecessarily high risks, rendering thoughtfully designed policies inefficient.”
The study highlights that people often overestimate their ability at a task, such as believing they are better-than-average drivers, or succumb to the “hot hand fallacy” from basketball, which ties a player’s likelihood of scoring future shots to their past shooting successes.
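One simple way to picture the bias (a hypothetical update rule, not a model from the study): the shooter’s true success rate never changes, but a biased observer revises their belief upward after a hit by more than they revise it downward after a miss, so streaks inflate confidence.

```python
# Illustrative sketch of "hot hand" overconfidence: the true success rate is
# constant, but the observer's belief rises more on a hit than it falls on a
# miss. All numbers are hypothetical.
import random

random.seed(2)

TRUE_P = 0.5            # the player's real, unchanging success rate
estimate = 0.5          # biased observer's belief about the next attempt
UP, DOWN = 0.10, 0.04   # asymmetric updates: hits count more than misses

streak = 0
for shot in range(15):
    success = random.random() < TRUE_P
    if success:
        streak += 1
        estimate = min(1.0, estimate + UP)
    else:
        streak = 0
        estimate = max(0.0, estimate - DOWN)
    print(f"shot {shot + 1:2d}: {'hit ' if success else 'miss'} "
          f"streak={streak} believed p={estimate:.2f} (true {TRUE_P})")
```

After a few lucky hits in a row, the believed success probability sits well above the true 0.5, which is the kind of inflated self-assessment the researchers warn can lead to unnecessary risk-taking.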
“If you win you think you’re doing really well, you fall in love with yourself,” Professor Quevedo said.
“As a control engineer, I always tended to assume that cooperative people somehow just do what they’re told because they’re part of a system.
“We need to incorporate a model of human behaviour, but human behaviour is a difficult thing.
“You don’t want to overload people with information because they can’t process all of it. But it’s much more refined than that.”
This multidisciplinary study of human behaviour, combining behavioural economics and control systems engineering, lays the groundwork for future research.
“Putting the worlds together is the first step for us. Now we want to continue,” Professor Quevedo said.
“The current work exposes the human as an under-observed source of errors in human-in-the-loop control systems.
“Future areas of research need to address how to design mechanisms for when, and how, to pass information on to human decision makers.”
Media contact:
Rod Chester, QUT Media, 07 3138 9449, rod.chester@qut.edu.au
After hours: Rose Trapnell, 0407 585 901, media@qut.edu.au