It is well established that human error plays a significant role in enabling organization-threatening information security incidents. One study, conducted with participation from more than 5,000 organizations worldwide, indicates that organizations are becoming more aware of employees' role in information security incidents. Fifty-two percent of the organizations surveyed said employees were their greatest weakness in IT security, with employee actions putting business and organizational information security strategies at risk.1
The standard response to human error is to increase cybersecurity awareness and training, often with mixed results. KnowBe4’s 2021 State of Privacy and Security Awareness Report notes that a large percentage of surveyed employees did not feel confident that they could identify a social engineering attack, recognize the warning signs that their computers were infected with malware, or describe to senior management the security risk associated with employees working from home.2 But what if people alone are not to blame? Poorly designed interfaces, systems, processes and decision-making criteria can all play a major role in human error.
Human factors engineering, which has been successfully applied in industries as varied as aviation and healthcare, has important lessons for the field of information security. Human factors engineering uses scientific knowledge about human behavior to specify the design and use of a human-machine system.3 The aim is to improve system efficiency by minimizing human error. The field is not new. As far back as 1965, Alphonse Chapanis, a pioneer in industrial design and human factors engineering, described it as research into the design of machines, processes and the environments in which people function so that they complement human capacities and limitations.4 Human factors engineering draws considerable inspiration from cognitive psychology, the study of human mental performance, memory and attention. It also borrows ideas from related disciplines, including anthropometrics, anatomy and physiology.5
Cybersecurity practitioners and technology leaders often fall into the trap of creating products based on sound engineering principles that are effective and efficient but fail to consider the end users: the frontline laypeople who use those products on a daily basis. If the design of a product creates ambiguity and confusion in the minds of those using it, they are more likely to experience stress, leading to errors and irrational decision-making. Incorporating human factors engineering early in the design of systems and processes leads to fewer errors in the field; it also reduces the need to depend on what is often unreliable user training to compensate for deficiencies in the product's human-machine interface.
The Effect of Human Factors
Considerable evidence-based research is available to engineers and cybersecurity practitioners to convince them that it is worth going beyond just creating a working, efficient system and that investing in human factors will lead to fewer cybersecurity incidents. Diverse industries have achieved significant benefits by reducing human error and stress through well-thought-out human–machine interfaces and design principles that keep the human front and center.
Aviation
A staggering 70 to 80 percent of safety incidents in civil and military aviation are attributable to human error.6 To err is human, but studying the consequences of errors, and the extent to which their effects can be reversed or mitigated, has led to improvements in aviation safety ranging from simple design changes, such as protective covers on flight deck switches to prevent inadvertent activation, to more complex avionics interface design.7
Chapanis studied airplane crashes that occurred during landing in certain types of World War II airplanes. The crashes occurred because pilots inadvertently retracted the wheels instead of the flaps during landing, and they were classified as pilot error.8 Curious as to why only certain types of aircraft encountered such problems, Chapanis discovered that in the problematic planes, the controls for the wheels and the flaps were placed right next to each other, whereas on safer aircraft, they were placed farther apart. Improvements based on that insight led to the development of unique rubber fittings over the controls to distinguish them from one another, a practice still widespread in modern aircraft.
Healthcare
In a case study of the application of human factors in healthcare settings, researchers identified the potentially life-threatening consequences of patient-monitoring devices that fell short of the high standards to which human-machine interfaces must adhere in healthcare to reduce errors and save lives.9 Personnel in charge of transporting patients in and out of intensive care unit (ICU) rooms were misled by erroneous readings from a device that monitors patients during transport. The device, often referred to as a transport monitor, led personnel to believe that critically ill patients were stable with respect to important parameters such as heart rate and blood pressure when they were actually well outside safe tolerances. A closer look revealed that the transport monitor was set to what the device manufacturer called demo mode, a state used to demonstrate the device's capabilities by showing simulated numbers instead of actual readings from patients. The researchers concluded that the small D in a corner of the device screen signifying demo mode was insufficient to get the attention of the people using the device.
With lives on the line, a human factors analysis of the device and its operation in the healthcare environment led to suggestions for improvement. These included having the device automatically exit demo mode after a short period of time and, following the example of space and aviation systems, having it display a large X across the background when showing simulated data. Rather than play the blame game of chastising whoever set the device to demo mode or failed to take it out of that mode, the case study highlighted that, armed with knowledge of human factors engineering, even simple changes to human-machine interfaces can reduce the likelihood of human error.
Automotive
A series of studies conducted in the early 1990s focused on the alarming statistics surrounding rear-end automobile collisions, which, at the time, accounted for approximately 20 percent of all accidents in the United States.10 Driver errors in detecting and interpreting the information presented by the vehicle ahead often led to rear-end collisions, which prompted the eventual development of the center high-mounted stop lamp (CHMSL). The US National Highway Traffic Safety Administration eventually concluded that cars with CHMSLs had fewer rear-end collisions, making the lamp standard equipment for cars in many countries.11 It was believed that CHMSLs were integral to reductions in driver errors because they are distinct and separate from the tail and turn lights and form part of a triangular pattern with the tail brake lights that improves visibility.12
Applying Human Factors in Cybersecurity
Current cybersecurity thinking is often so focused on humans being the weakest link in the chain that it ignores the science explaining why humans make mistakes in the first place. For example, in 1997, a young engineer dealing with freezing conditions late at night in the data center of a small organization in Amarillo, Texas, USA, turned off the air conditioning and forgot to turn it back on, leading to hardware damage to several computers. Most organizations now employ locked thermostat covers to reduce this type of error; the locks are mainly a deterrent rather than a preventive control.
Incorporating human factors engineering into the design of processes and systems could provide a better return on investment (ROI) than awareness training alone.
Closing the Loop
For the Cyber Claims Analysis Report, researchers analyzed data from 1,200 data breach claims in nearly 50 countries from 2013 through 2019 and identified human errors such as employees clicking on links in phishing emails or replying to spoofed emails as the most common root causes of breaches.13 So, why not throw humans a lifeline to make it easier for them to prevent errors?
There are two distinct approaches that incorporate some level of human factors engineering to make it easier for humans to verify genuine emails and reduce the chances of error.
The first approach closes the loop on emails by tying the content of the email back to a verification mechanism on the organization's well-known corporate website (figure 1).
The second, related approach is suited for public communication that is not sensitive in nature (figure 2). It mirrors the content of the email on the organization's public website. The email offers a verification mechanism by directing the recipient, without using a link, to visit the well-known enterprise website, where the contents of the email match the latest announcement in the website's news or announcements section (figure 3).
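As a rough sketch of how the first approach might be automated, the Python example below derives a short verification code from an outgoing message so that the same code can be included in the email and published on the organization's well-known website for side-by-side comparison. The service, key handling and code format are hypothetical assumptions for illustration, not a description of any particular product.

```python
# A minimal sketch, assuming a hypothetical outbound email service: it derives
# a short verification code from the message, publishes the code on the
# organization's well-known website (e.g., a /verify page) and includes the
# same code in the email so the recipient can compare the two without clicking
# any link.
import hashlib
import hmac

SIGNING_KEY = b"replace-with-a-managed-secret"  # hypothetical key held by the sender

def verification_code(message_id: str, message_body: str) -> str:
    """Derive a short, human-comparable code for an outgoing email."""
    digest = hmac.new(SIGNING_KEY,
                      f"{message_id}:{message_body}".encode(),
                      hashlib.sha256).hexdigest()
    # A short prefix is enough: the code is meant for human comparison against
    # the copy published on the website, not for cryptographic validation.
    return digest[:8].upper()

code = verification_code("MSG-1024", "Your 2024 benefits enrollment opens Monday.")
print(f"Include in the email and publish on the corporate website: {code}")
```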
Usability
Jakob Nielsen, cofounder of the human-machine interface research firm Nielsen Norman Group, maintained that system design should focus on usability not only to reduce errors, but also to provide a better end-user experience.14 A good system should exhibit:
- Learnability
- Efficiency
- Memorability
- Error recovery
- Satisfaction
Anecdotally, an insurance enterprise was surprised by a 500 percent increase in support tickets after a major product release. Upon investigation, it discovered that the release had changed customers' usernames from their email addresses to randomly generated usernames (figure 4).
People typically have no trouble remembering their email addresses but struggle to remember randomly generated usernames. It was, therefore, no surprise that the change generated more support requests.
Antipatterns
Usability antipatterns are a continuing distraction for the layperson desperately trying to apply lessons learned from awareness training. Several kinds of email illustrate the struggle. During cybersecurity training, employees are often taught to hover over links to ensure that they point to familiar websites. When links in emails are made up of random characters instead of the domain name of the organization that sent them, the recipient stands little chance of knowing whether the links point to the website of an organization they trust (figures 5 and 6).
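The sketch below, using hypothetical domains and URLs, makes explicit the comparison the hovering advice asks users to perform in their heads; when the link host is a string of random characters, there is nothing familiar to anchor that comparison to.

```python
# A minimal sketch, assuming a hypothetical allowlist of trusted domains, of
# the mental check behind "hover over the link before clicking".
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"example.com", "examplebank.com"}  # hypothetical

def points_to_trusted_site(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS)

print(points_to_trusted_site("https://www.example.com/hr/benefits"))      # True
print(points_to_trusted_site("https://x7k9q2.click-track-zzy.net/aZ3b9"))  # False: random characters
```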
Another example involves emails that claim to be encrypted and secure but provide very little context other than an HTML file attachment, a file type not normally used in business correspondence. To make matters worse, the user is instructed to open the attachment to read the message. These are legitimate emails from email encryption service providers, sent with very little thought put into how humans, trained to be cautious about links and attachments, might perceive them (figure 7).
To counter these antipatterns, introduce a measure of usability into the system and, ideally, reduce errors, some organizations give end users a way to report suspicious emails using a phish alert button. The feedback end users receive from information security teams about each reported email not only gives rank-and-file employees a sense of participation in the organization's defense, but also reinforces awareness training (figure 8).
Data Classification by Default
Another anecdote centers on an organization in which data owners, stewards, custodians and administrators constantly struggled to reduce data risk for common network files. A simple tweak made it easier for employees to avoid errors and, in the process, balance reasonable security and usability.
Rather than controlling network file shares with permissions that required employee participation in every data-securing transaction and time-consuming administration, the organization split the network shares into predefined sections with preset security for each department (figure 9).
The result was a scheme wherein end users instinctively knew where to drag their files and what sort of exposure the files would get by default, with the freedom to apply extra security measures if needed.
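A minimal sketch of what such preset sections might look like follows; the section names, paths and groups are hypothetical, and in practice the defaults would be enforced as access control lists on the file server rather than in application code.

```python
# Illustrative only: hypothetical "predefined sections with preset security"
# in the spirit of figure 9.
from dataclasses import dataclass

@dataclass(frozen=True)
class SharePreset:
    path: str
    readable_by: str
    writable_by: str

DEPARTMENT_PRESETS = {
    "Public":       SharePreset(r"\\files\claims\public", "All employees", "Claims staff"),
    "Departmental": SharePreset(r"\\files\claims\departmental", "Claims staff", "Claims staff"),
    "Restricted":   SharePreset(r"\\files\claims\restricted", "Claims managers", "Claims managers"),
}

def describe(section: str) -> str:
    p = DEPARTMENT_PRESETS[section]
    return (f"Files dropped in {p.path} are readable by {p.readable_by} "
            f"and writable by {p.writable_by} by default.")

for name in DEPARTMENT_PRESETS:
    print(describe(name))
```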
Multifactor Authentication Fatigue
Human factors engineering has long recognized that humans tend to get overloaded and make poor decisions when confronted with alarms, warnings and conflicting messages coming from sensors and instruments during an emergency. Bad actors now use the same principle of sensory overload to bombard users with a repeated stream of multifactor authentication (MFA) prompts, hoping they will approve a prompt just to make the notifications stop (figure 10).
MFA prompts are being redesigned to help humans make better decisions and reduce the chances of error while maintaining usability, for example by requiring users to type in a matching code and by providing more context (e.g., the location from which the sign-in originates and the service the user is trying to access) (figures 11 and 12).15
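A minimal sketch of number-matching style verification in the spirit of figures 11 and 12 follows; the prompt text, code length and context fields are assumptions, and a real deployment would rely on the identity provider's own implementation.

```python
# Illustrative number-matching MFA: the user must copy a code shown on the
# sign-in screen into the authenticator prompt, which also displays context.
import secrets

def start_mfa_challenge(sign_in_location: str, service: str) -> tuple[str, str]:
    """Return the two-digit code shown on the sign-in screen and the context
    displayed in the authenticator prompt."""
    code = f"{secrets.randbelow(100):02d}"
    prompt = (f"Sign-in attempt from {sign_in_location} to {service}. "
              f"Enter the number shown on your sign-in screen.")
    return code, prompt

def verify_response(expected_code: str, entered_code: str) -> bool:
    # Typing the code forces the user to read the prompt's context instead of
    # reflexively tapping "Approve" to silence repeated notifications.
    return secrets.compare_digest(expected_code, entered_code)

code, prompt = start_mfa_challenge("Austin, Texas, USA", "Payroll portal")
print(prompt)
print(verify_response(code, code))  # True only when the on-screen number is copied correctly
```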
Passwords
Microsoft's announcement that it no longer recommends periodic password expiration in organizations has drawn some attention in the cybersecurity world.16 To those well-versed in human factors engineering, it has been a long time coming. Human factors engineering is the study of people's relationships with machines, procedures and the environments in which they operate.17 It makes sense for organizations to seek to better understand what causes frustration and annoyance in the workplace, which can, in turn, lead to mistakes or policy abuses. Some researchers have long believed that forcing people to change passwords frequently leads them to choose weak passwords to begin with and then alter them only slightly when required, keeping them minimally compliant with policy. Recent studies show that this practice may make accounts easier to compromise.18
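The sketch below illustrates, with hypothetical example passwords and simplistic normalization rules, why that behavior is risky: a user keeps a base word and rotates a predictable suffix, so an old, breached password strongly predicts the current one. It is an illustration of the pattern, not a vetted password-strength check.

```python
# Illustrative only: detect when a "new" password is a trivial variation of an
# old one (base word kept, suffix rotated, common substitutions applied).
import re

def base_form(password: str) -> str:
    """Strip a trailing run of digits/symbols and undo common substitutions."""
    core = re.sub(r"[\d!@#$%^&*]+$", "", password.lower())
    return core.replace("@", "a").replace("0", "o").replace("3", "e").replace("1", "l")

def trivially_derived(old: str, new: str) -> bool:
    return base_form(old) == base_form(new)

print(trivially_derived("Winter2023!", "Winter2024!"))       # True: only the suffix rotated
print(trivially_derived("Winter2023!", "Meadow-Lark-Tram"))  # False: unrelated passphrase
```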
Conclusion
Just as in aviation, where pilots are often blamed when accidents occur,19 blaming the average employee for cybersecurity incidents does little to fix inherent design flaws that fail to take human limitations into account. Organizations are coming to the sobering realization that awareness training is never going to be enough to reduce human errors. Rather than trying to hold the overwhelmed frontline employee accountable for falling for phishing emails or making futile attempts to educate people who struggle with technology, trained cybersecurity professionals should focus more of their attention on designing systems, interfaces, processes and environments that are more intuitive and less confusing. Studying how humans interpret and interact with the systems around them, and reducing the chances of error through small tweaks that decrease mental workload or improve situational awareness, should pay dividends for organizations.
Endnotes
1 Kaspersky, “The Human Factor in IT Security: How Employees Are Making Businesses Vulnerable From Within,” www.kaspersky.com/blog/the-human-factor-in-it-security
2 KnowBe4, 2021 State of Privacy and Security Awareness Report, USA, 2021, http://www.knowbe4.com/hubfs/2021-State-of-Privacy-Security-Awareness-Report-Research_EN-US.pdf
3 Adams, J. A.; Human Factors Engineering, Macmillan, USA, 1989
4 Chapanis, A. R.; Man-Machine Engineering, Wadsworth Pub. Co., USA, 1965
5 Gosbee, J.; “Human Factors Engineering and Patient Safety,” Quality and Safety in Health Care, vol. 11, iss. 4, December 2002, http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1758019/
6 O’Hare, D.; M. Wiggins; R. Batt; D. Morrison; “Cognitive Failure Analysis for Aircraft Accident Investigation,” Ergonomics, vol. 37, 1994, http://www.tandfonline.com/doi/abs/10.1080/00140139408964954
7 Abbott, K. H.; “Human Factors Engineering and Flight Deck Design,” The Avionics Handbook, USA, 2001, http://www.helitavia.com/avionics/TheAvionicsHandbook_Cap_9.pdf
8 Lewis, J.; “Human Factors Engineering,” Encyclopedia of Software Engineering, 2011, http://www.researchgate.net/publication/320935039_Human_Factors_Engineering
9 Op cit Gosbee
10 Kahane, C. J.; An Evaluation of Center High-Mounted Stop Lamps Based on 1987 Data, US Department of Transportation, USA, 1989
11 The Free Library, “Rear Light Arrangements for Cars Equipped With a Center High-Mounted Stop Lamp,” 2014, http://www.thefreelibrary.com/Rear+light+arrangements+for+cars+equipped+wit%20h+a+center+high-mounted...-a017404825
12 McKnight, A. J.; D. Shinar; “Brake Reaction Time to Center High-Mounted Stop Lamps on Vans and Trucks,” Human Factors: The Journal of the Human Factors and Ergonomics Society, vol. 34, iss. 2, 1992
13 Willis Towers Watson, Cyber Claims Analysis Report, United Kingdom, 2020, http://www.wtwco.com/en-nz/insights/2020/07/cyber-claims-analysis-report
14 Nielsen, J.; Usability Engineering, Morgan Kaufmann, USA, 1993
15 Weinert, A.; “Defend Your Users From MFA Attacks,” Microsoft, 28 September 2022, http://techcommunity.microsoft.com/t5/microsoft-entra-azure-ad-blog/defend-your-users-from-mfa-fatigue-attacks/ba-p/2365677
16 Microsoft Security Guidance Blog, “Security Baseline (FINAL) for Windows 10 v1903 and Windows Server v1903,” 23 May 2019, http://learn.microsoft.com/en-us/archive/blogs/secguide/security-baseline-final-for-windows-10-v1903-and-windows-server-v1903
17 Hawkins, F. H.; Human Factors in Flight, Routledge, United Kingdom, 1987
18 Cranor, L.; “Time to Rethink Mandatory Password Changes,” Federal Trade Commission, 2 March 2016, USA, http://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2016/03/time-rethink-mandatory-password-changes
19 Endsley, M. R.; “Human Factors and Aviation Safety: Testimony to the United States House of Representatives Hearing on Boeing 737-Max8 Crashes,” Human Factors and Ergonomics Society, 2019
RANJIT BHASKAR
Is an information security officer at Texas Windstorm Insurance Association (TWIA). He has 26 years of experience in enterprise architecture and information security and is the author of several journal articles and op-eds. He can be reached via LinkedIn at http://www.linkedin.com/in/ranjit-bhaskar-467877218/.