According to research from the Centers for Disease Control and Prevention, each day in the United States eight people are killed and 1,161 injured in motor vehicle crashes related to distracted driving. While cell phones are a common source of distraction among drivers, roadside signage can also draw drivers’ attention away from the road. Directional guide signs, as well as commercial advertisements, are considerable external distractions that can cause drivers to take their eyes off the road.
And driver inattention is not only a civilian problem; police officers are also prone to multitasking while driving. Of particular interest to researchers investigating this phenomenon are the in-vehicle mobile computer terminals (MCTs) used by police officers and other emergency vehicle personnel. MCTs consist of a screen and keyboard that can be used to view and enter information, and to transmit data to peripheral devices like two-way radios. Officers are often called upon, through computer-assisted dispatching, to access and respond to on-screen information while driving, especially in emergency situations. But MCTs can also present a significant safety hazard for police officers who must use them while simultaneously operating a vehicle.
It is often the very tools that aim to improve the driving experience – and enhance the ability of police to enforce road safety – that present the biggest hazards. It’s a conundrum – and a unique engineering challenge.
In order to design better technologies, engineers first need to understand how drivers respond – whether consciously or subconsciously – to a host of visual stimuli. By identifying those triggers that cause drivers to divert their concentration from the road, engineers can propose new designs that decrease or even eliminate the greatest risk factors.
Human-systems engineering improves usability and safety by considering cognitive factors
Maryam Zahabi, a Ph.D. candidate at North Carolina State University’s Department of Industrial and Systems Engineering, has taken on that very challenge, using both quantitative and subjective data collected during experimentation to inform safety-related design improvements. The goal of the human factors area of research at NC State, Zahabi says, is to improve quality of life. “By designing on-road signs and in-vehicle technologies in a way that does not distract drivers from the task at hand, our research will improve safety for all people.”
Zahabi came to NC State in 2013 from Iran, where she completed both her BS and MS at the prestigious Sharif University of Technology in Tehran. With a background in industrial engineering and ergonomics, Zahabi says she became interested in studying the human cognitive aspects of systems engineering after getting involved in research at NC State. “Coming to the department, I saw very exciting research projects in the human factors area – multitasking, driver distraction and usability in electronic medical records, for example.” Ultimately, Zahabi decided to shift her own research into the cognitive domain.
In the long run, research like Zahabi’s aims to help people better navigate the world. “Ultimately,” she says, “the findings of this research will be helpful in improving safety for both civilians and police officers.”
Strategic design of experiments enables researchers to determine causal relationships in multifactor opportunity spaces
In the human factors field of engineering, Zahabi says, data analysis is the most important aspect of research. “We want to make sure that the findings of our research are powerful and can be generalizable. We need to make sure that our proposed enhancements could actually increase performance with statistically significant findings.”
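To make that concern concrete, here is a minimal sketch – in Python rather than JMP, which is the tool the article describes – of the kind of a priori power analysis that underpins such claims: estimating how many participants per group are needed to detect an effect at a given significance level. The effect size, alpha and power target below are illustrative assumptions, not values from Zahabi’s studies.

```python
# Rough sketch of an a-priori power analysis for a two-group comparison.
# The effect size, alpha, and power target are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,         # assumed medium effect (Cohen's d)
    alpha=0.05,              # significance level
    power=0.8,               # desired statistical power
    alternative="two-sided",
)
print(f"Participants needed per group: {n_per_group:.1f}")
```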
Industrial and systems engineers face complex, multifactor opportunity spaces in which a meticulous, methodical approach to experimentation is crucial to producing usable results. Information gathering in these research environments can be challenging, however, as relationships between inputs and outputs are difficult to spot. To ascertain such linkages, researchers like Zahabi and her colleagues vary experimental factors in deliberate, strategic ways. To organize this process, many turn to a strategy often referred to as design of experiments (DOE).
DOE involves the adoption of a pre-specified design that enables researchers to model the ways in which factors independently or jointly affect a response. When she first arrived at NC State, Zahabi recalls, “I didn’t know that I could use a statistical tool to design experiments; I always thought that software was only for data analysis. But after taking a course on DOE, I learned how I could use JMP to produce different designs …. And now I use JMP for designing experiments as well as doing data analysis.”
Zahabi says JMP® affords her the ability to create custom designs around specific questions, quickly and easily.
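To give a flavor of what such a design-and-model workflow involves, the sketch below builds a small full-factorial design and fits a model with main effects and a two-way interaction. The factor names, levels and simulated response are hypothetical; the article describes this work being done in JMP, so the Python code is only an analogy, not Zahabi’s actual analysis.

```python
# Minimal DOE sketch: a replicated full-factorial design with two hypothetical
# factors, plus a model estimating main effects and their interaction.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Two hypothetical two-level factors.
levels = {
    "sign_complexity": ["low", "high"],
    "mct_use": ["off", "on"],
}
replicates = 5

# Full-factorial design: every combination of factor levels, replicated.
runs = list(itertools.product(*levels.values())) * replicates
design = pd.DataFrame(runs, columns=list(levels.keys()))

# Simulated response (e.g., a lane-deviation score) so the sketch is runnable.
effect = (
    (design["sign_complexity"] == "high").astype(float) * 0.8
    + (design["mct_use"] == "on").astype(float) * 1.2
    + ((design["sign_complexity"] == "high") & (design["mct_use"] == "on")) * 0.5
)
design["lane_deviation"] = effect + rng.normal(0, 0.5, len(design))

# Fit main effects and the two-way interaction, as a DOE analysis would.
model = smf.ols("lane_deviation ~ sign_complexity * mct_use", data=design).fit()
print(model.summary())
```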
One software solution for many kinds of statistical analysis
In addition to DOE, Zahabi uses JMP to run ANOVA, power analyses and correlations. Zahabi and her colleagues work with continuous (e.g., task completion time), binary (e.g., gender), nominal (e.g., physical exertion level) and ordinal (e.g., age range: young, middle, elderly) data, as well as subjective data from literature surveys and Likert scales. JMP, she says, “helps me to do different types of analysis – like parametric and nonparametric analyses and correlation, for example – very easily. And I also use JMP for data screening and the identification of outliers before applying inferential statistic methods.”
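For readers who want a concrete picture of that workflow, the sketch below walks through simulated data: screening for outliers, running a one-way ANOVA, its nonparametric counterpart (Kruskal-Wallis) and a rank correlation. All variable names and values are invented for illustration; they do not come from Zahabi’s research or from JMP output.

```python
# Illustrative analysis pipeline on simulated data: outlier screening, a
# parametric test (one-way ANOVA), a nonparametric alternative, and a
# correlation. Variable names and values are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical task-completion times (seconds) for three interface designs.
df = pd.DataFrame({
    "design": np.repeat(["A", "B", "C"], 30),
    "completion_time": np.concatenate([
        rng.normal(12, 2, 30), rng.normal(14, 2, 30), rng.normal(13, 2, 30)
    ]),
})

# Data screening: drop outliers more than 1.5 IQRs beyond the quartiles.
q1, q3 = df["completion_time"].quantile([0.25, 0.75])
iqr = q3 - q1
clean = df[df["completion_time"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]

groups = [g["completion_time"].values for _, g in clean.groupby("design")]

# Parametric: one-way ANOVA across the three designs.
f_stat, p_anova = stats.f_oneway(*groups)

# Nonparametric alternative if normality is doubtful: Kruskal-Wallis.
h_stat, p_kw = stats.kruskal(*groups)

# Correlation between completion time and a hypothetical Likert-style rating.
clean = clean.assign(workload_rating=rng.integers(1, 8, len(clean)))
rho, p_rho = stats.spearmanr(clean["completion_time"], clean["workload_rating"])

print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.3f}")
print(f"Kruskal-Wallis: H={h_stat:.2f}, p={p_kw:.3f}")
print(f"Spearman rho={rho:.2f}, p={p_rho:.3f}")
```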