Automation Bias in Intelligent Time Critical Decision Support Systems

Speaker: Missy Cummings, Humans and Automation Lab, MIT

Date: Friday, October 01, 2004

Time: 1:30 PM to 2:30 PM

Refreshments: 3:15 PM

Public: Yes

Location: Patil Seminar Room (32-G449)

Host: Jaime Teevan, CSAIL

Contact: Jaime Teevan, 617/253-1611, teevan@mit.edu

Reminders to: hci-seminar@csail.mit.edu

Reminder Subject: TALK: Automation Bias in Intelligent Time Critical Decision Support Systems

Abstract:

Intelligent decision support systems can introduce various levels of automation, from fully automated, in which the operator is left out of the decision process entirely, to minimal levels of automation, in which the automation only makes recommendations and the operator has the final say. For rigid tasks that require no flexibility in decision making and carry a low probability of system failure, higher levels of automation often provide the best solution. However, in time critical environments with many external and changing constraints, such as air traffic control and military command and control operations, higher levels of automation are not advisable because of the risks involved, the complexity of the system, and the inability of the automated decision aid to be perfectly reliable. Human-in-the-loop designs, which employ automation for redundant, manual, and monotonous tasks while allowing operators active participation, not only provide safety benefits but also allow a human operator and a system to respond more flexibly to uncertain and unexpected events. Even so, there can be measurable costs to human performance when automation is used, such as loss of situational awareness, complacency, skill degradation, and automation bias.

This talk will discuss the influence of automation bias in intelligent decision support systems, particularly in aviation domains. Automation bias occurs when humans disregard, or do not search for, contradictory information in light of a computer-generated solution that is accepted as correct, and it can be exacerbated in time critical domains. Automated decision aids are designed to reduce human error, but if they are not designed with human cognitive limitations in mind, they can actually introduce new errors into the operation of a system.
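
The spectrum of automation levels described in the abstract can be made concrete with a short sketch. The code below is purely illustrative and not taken from the talk: the AutomationLevel names, the Recommendation fields, and the decide and biased_review functions are all hypothetical, intended only to show how a biased operator review makes the lowest level of automation behave like the highest.

```python
from dataclasses import dataclass
from enum import Enum


class AutomationLevel(Enum):
    RECOMMEND_ONLY = "recommend_only"    # aid recommends; operator has the final say
    FULLY_AUTOMATED = "fully_automated"  # operator is left out of the decision process


@dataclass
class Recommendation:
    action: str
    confidence: float  # the aid's self-reported confidence, 0..1 (hypothetical field)


def decide(level: AutomationLevel, rec: Recommendation, operator_review) -> str:
    """Return the action executed under the given level of automation.

    operator_review is a callable taking a Recommendation and returning
    (accept, alternative). Automation bias shows up when this review
    rubber-stamps rec.action instead of searching for contradictory
    information.
    """
    if level is AutomationLevel.FULLY_AUTOMATED:
        return rec.action  # the human is never consulted
    accept, alternative = operator_review(rec)
    if accept or alternative is None:
        return rec.action  # the recommendation stands
    return alternative     # the operator overrides the aid


def biased_review(rec: Recommendation):
    """A biased operator: accepts the computer-generated solution as correct."""
    return True, None


rec = Recommendation(action="reroute around traffic", confidence=0.9)
print(decide(AutomationLevel.RECOMMEND_ONLY, rec, biased_review))
# Even at the lowest level of automation, a biased review yields the
# same outcome as full automation.
```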

Bio:

Mary (Missy) Cummings received her B.S. in Mathematics from the United States Naval Academy in 1988, her M.S. in Space Systems Engineering from the Naval Postgraduate School in 1994, and her Ph.D. in Systems Engineering from the University of Virginia in 2003. A naval officer and military pilot from 1988 to 1999, she was one of the Navy's first female fighter pilots. She is currently the Boeing Assistant Professor in the Aeronautics & Astronautics Department at the Massachusetts Institute of Technology. Her previous teaching experience includes instructing for the U.S. Navy at Pennsylvania State University and serving as an assistant professor in the Virginia Tech Engineering Fundamentals Division. Her research interests include human supervisory control, collaborative human-computer decision making, decision support, information complexity in displays, and the ethical and social impact of technology.

This talk is part of the HCI Seminar Series Fall 2004.
