Human factors are the psychological, physiological, and environmental properties that are intrinsic to people and that influence their interaction with the world. Scientific evidence shows that certain human factors — such as fatigue, time of day, distractions, and even visual display formats — affect human performance and impact safety in industries such as aviation, trucking, healthcare, manufacturing, and nuclear power.
The National Transportation Safety Board’s investigative processes look beyond mechanical failures to the human factors that contribute to an accident. The FAA’s Dirty Dozen lists 12 common human-factor causes of mistakes in the aviation workplace.
Safety and security are closely linked; after all, security breaches can provide unauthorized access to safety-critical systems. You’ve probably read of attackers gaining remote access to medical infusion pumps or shutting down automobile safety systems. And software, which is a major component of most safety-critical systems, is notoriously insecure.
What can software engineering learn from human factors research in safety-critical systems?
Where App Security Meets Psychology
I set out to answer this question with my research partners at Secure Decisions and Rochester Institute of Technology. If we could identify the human factors that play a role in software security, then development managers could use that knowledge to reduce the accidental introduction of vulnerabilities in code. And security teams could locate code that was more likely to be vulnerable.
We reviewed the scientific literature and conducted our own research. We looked at factors such as team size, level of focused attention, physical separation of developers, time of day when code was written, and hours worked per day, and we assessed their relationships to the vulnerabilities found in the software.
Through the research, we were able to identify a number of human factors that are associated with insecure software. With the move to remote work, it is especially important to understand the human factors that managers need to support, discourage, and monitor to create ideal working conditions for remote teams. Here is a high-level summary of some of our key findings.
Developers Need Focused Attention
Unfocused contribution rises when a developer is modifying multiple files at once or when the number of unique contributors to a file increases. Unfocused contribution is associated with a greater number of vulnerabilities. This suggests that development managers should think twice before assigning too many separate tasks to a single developer. And developers should situate themselves in an environment that is relatively free from distractions.
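To make that concrete, here is a minimal sketch (my own illustration, not tooling from the research) that approximates one side of unfocused contribution: how many distinct files each developer has touched recently. The 30-day window is an arbitrary choice for illustration.

```python
# A rough proxy for unfocused contribution: the number of distinct files each
# developer touched in the last 30 days. The window is an illustrative choice,
# not a threshold from the research. Run from the root of a git working copy.
import subprocess
from collections import defaultdict

log = subprocess.run(
    ["git", "log", "--since=30.days", "--name-only", "--pretty=format:AUTHOR:%an"],
    capture_output=True, text=True, check=True,
).stdout

files_per_author = defaultdict(set)
author = None
for line in log.splitlines():
    if line.startswith("AUTHOR:"):
        author = line[len("AUTHOR:"):]
    elif line.strip() and author:
        files_per_author[author].add(line.strip())

# Developers spread across many files may be juggling too many separate tasks.
for name, files in sorted(files_per_author.items(), key=lambda kv: -len(kv[1])):
    print(f"{name}: {len(files)} distinct files touched in the last 30 days")
```

A developer whose name sits at the top of that list with dozens of files is a reasonable candidate for fewer concurrent assignments.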
Bigger Teams Correlate to Less Secure Code
Larger teams mean more weaknesses and vulnerabilities. It’s hard to say what the ideal team size is. But research shows that Chromium files with 9 or more developers were 68 times more likely to have a vulnerability than files with fewer developers, and Apache web server files with 9 or more developers were 117 times more likely. So, my advice is to keep development teams relatively small and focused.
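The companion measurement is the number of unique contributors per file. The sketch below (again my own illustration, not the researchers’ tooling) counts distinct commit authors for each file and flags anything at or above the nine-developer level cited above.

```python
# Count unique commit authors per file and flag files at or above the
# nine-developer level mentioned above. Run from the root of a git working copy.
import subprocess
from collections import defaultdict

log = subprocess.run(
    ["git", "log", "--name-only", "--pretty=format:AUTHOR:%ae"],
    capture_output=True, text=True, check=True,
).stdout

authors_per_file = defaultdict(set)
author = None
for line in log.splitlines():
    if line.startswith("AUTHOR:"):
        author = line[len("AUTHOR:"):]
    elif line.strip() and author:
        authors_per_file[line.strip()].add(author)

# Files with many distinct contributors deserve extra scrutiny during review.
for path, contributors in sorted(authors_per_file.items(), key=lambda kv: -len(kv[1])):
    if len(contributors) >= 9:
        print(f"{path}: {len(contributors)} unique contributors")
```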
Excessive Work Hours Affect Performance
Research-based guidelines in aviation and medicine indicate that people engaged in safety-critical work should not work more than 11 hours per day. It is well known that human performance degrades significantly when people work for long stretches of time. We should apply this to software developers and limit their sustained work to no more than 11 hours per day. Software “death marches” should be avoided, not applauded.
The Time of Day Code Is Written Matters
Files with code churned between midnight and 8 AM, or between noon and 4 PM, contain more vulnerabilities. This maps to our circadian rhythms, the cyclical changes in our mental alertness and physiological arousal over the course of a day. Most humans’ alertness wanes between midnight and 6 AM, and many also sustain reduced alertness around 2 PM. It is prudent for software engineers not to code in the wee hours of the morning and to take a break in the early afternoon.
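Commit timestamps make this easy to check on your own project. The sketch below (my own illustration) tallies commits by the hour recorded in each author’s timezone and reports how much of the work falls in those two low-alertness windows.

```python
# Tally commits by the hour of day recorded in each author's timezone and report
# how much of the work lands in the low-alertness windows described above.
# Run from the root of a git working copy.
import subprocess
from collections import Counter

hours = subprocess.run(
    ["git", "log", "--date=format:%H", "--pretty=format:%ad"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

by_hour = Counter(int(h) for h in hours if h.strip())
total = sum(by_hour.values()) or 1

late_night = sum(n for h, n in by_hour.items() if h < 8)              # midnight-8 AM
early_afternoon = sum(n for h, n in by_hour.items() if 12 <= h < 16)  # noon-4 PM

print(f"{late_night / total:.0%} of commits were made between midnight and 8 AM")
print(f"{early_afternoon / total:.0%} of commits were made between noon and 4 PM")
```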
Team Location Does Not Influence Code Security
Research conducted by Microsoft found no difference in software security between teams that shared the same building, cafeteria, campus, locality, or even continent. Distributed teams and co-located teams had essentially the same number of post-release failures. This is good to know now that so many of us work remotely.
Studying human factors gives us a new way to identify source code that is more likely to contain vulnerabilities, based on what we know about the developers and teams that wrote it. For example, analysts and developers could first triage static analysis findings or perform code reviews on code that was built by a team of nine developers and largely committed at 2 AM.
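As a hypothetical illustration of that kind of triage, the sketch below combines two of the signals discussed earlier, unique contributors per file and the share of a file’s commits made between midnight and 8 AM, into a simple ranking. The equal weighting is purely illustrative and not drawn from the research.

```python
# A hypothetical triage ranking: order files by unique contributor count plus the
# share of their commits made between midnight and 8 AM. The weighting is purely
# illustrative, not from the research. Run from the root of a git working copy.
import subprocess
from collections import defaultdict

log = subprocess.run(
    ["git", "log", "--name-only", "--date=format:%H", "--pretty=format:COMMIT:%ae:%ad"],
    capture_output=True, text=True, check=True,
).stdout

authors = defaultdict(set)        # file -> unique author emails
night_commits = defaultdict(int)  # file -> commits made between midnight and 8 AM
total_commits = defaultdict(int)  # file -> all commits touching the file

author, hour = None, None
for line in log.splitlines():
    if line.startswith("COMMIT:"):
        author, hour_text = line[len("COMMIT:"):].rsplit(":", 1)
        hour = int(hour_text)
    elif line.strip() and author is not None:
        path = line.strip()
        authors[path].add(author)
        total_commits[path] += 1
        if hour < 8:
            night_commits[path] += 1

def risk_score(path: str) -> float:
    # Contributor count plus the fraction of commits made in the late-night window.
    return len(authors[path]) + night_commits[path] / total_commits[path]

# Review the highest-scoring files first.
for path in sorted(total_commits, key=risk_score, reverse=True)[:20]:
    print(f"{risk_score(path):6.2f}  {path}")
```

A team could feed a list like this into its code review queue or use it to decide which static analysis findings to examine first.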
Understanding human factors is especially important as we develop new models for remote work. Managers could use human factors research to shape a remote work environment with shorter sustained work hours and fewer concurrent projects, which in turn fosters more secure development practices.
Anita D’Amico, PhD, is CEO of Code Dx Inc., which provides solutions that automate application security workflows within a DevOps environment. Prior to becoming CEO, she led the development of innovative cybersecurity technologies for over 20 years.