It's easy to overlook the human element in the workplace. In business, people discuss serious topics like profit and strategic planning, so it's assumed that everyone is driven by data alone and makes rational, evidence-based choices. In a business environment, people are treated as rational actors who exercise self-control and always make optimal decisions.
Behavioral economics challenges these assumptions. It argues that even in business, humans are subject to emotions and impulses. The theory holds that we are still human beings at work, and that our surroundings tend to nudge us toward irrational decisions. In other words, even with all the data at hand, we don't always follow the economic model; in short, we don't always do what we "should."
Three Ways Behavioral Economics Affects Security
There is always a human factor behind people's decisions, but in business, emotions are often overshadowed by big data. No matter how much we resist, the emotions remain: however hard we try, human beings cannot help being biased.
Security is an area heavily influenced by behavioral economics. Because cybersecurity is a high-pressure field that demands continuous incident management, the biases that behavioral economics describes can, if security professionals are not careful, hinder security programs and derail risk management.
Mental Accounting
Mental accounting is an important concept in behavioral economics. It refers to the process by which people mentally code, categorize, and evaluate outcomes, especially economic outcomes, and it holds that an individual's perception of money varies depending on context. When people assign different values to the same money based on their circumstances or on how a question is framed, they make irrational decisions.
Mental accounting can affect cybersecurity because it is difficult to secure budget for risks that have not yet materialized. Even if you desperately need funds to purchase an incident response retainer, the mental accounting of other executives may discount that need because the risk has not occurred and may never occur.
Mental accounting may have finance or other executives asking, "Why pay for something that might never happen?" Cybersecurity leaders know that protecting the business is all about planning, and failing to buy the right tools can be painful in the event of a data breach or security incident. As IBM's report reveals, the average cost of a data breach is $4.45 million. Security leaders must therefore determine budget needs effectively to protect the business and ensure that sufficient funding is available for data breach response.
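One way to counter that mental-accounting framing is to put the spend in expected-loss terms. The sketch below is illustrative only: the breach probability, retainer cost, and impact-reduction figures are hypothetical assumptions, and only the $4.45 million average breach cost comes from the IBM figure cited above.

```python
# Illustrative expected-loss comparison (hypothetical figures, not a real risk model).
AVG_BREACH_COST = 4_450_000            # IBM-reported average cost of a data breach (USD)
ANNUAL_BREACH_PROBABILITY = 0.10       # assumed chance of a breach in a given year
RETAINER_COST = 50_000                 # assumed annual cost of an incident response retainer
IMPACT_REDUCTION_WITH_RETAINER = 0.25  # assumed reduction in breach impact with a retainer

# Expected annual loss if no retainer is purchased
expected_loss_without = ANNUAL_BREACH_PROBABILITY * AVG_BREACH_COST

# Expected annual loss with the retainer, including the retainer fee itself
expected_loss_with = (
    ANNUAL_BREACH_PROBABILITY * AVG_BREACH_COST * (1 - IMPACT_REDUCTION_WITH_RETAINER)
    + RETAINER_COST
)

print(f"Expected annual loss without retainer: ${expected_loss_without:,.0f}")
print(f"Expected annual loss with retainer:    ${expected_loss_with:,.0f}")
```

Under these assumed numbers, the retainer pays for itself on expected value alone. The point is not the specific figures but the framing: comparing the spend against expected loss, rather than against a budget line for "something that might never happen," works against the mental-accounting bias.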
Sunk Cost Fallacy
If cybersecurity professionals become too attached to a security roadmap instead of adapting as they go, they risk falling into the sunk cost fallacy: people keep investing in projects that are doomed to fail because they have already sunk a great deal of time or resources into them. If you have a multi-year security roadmap in place, it's easy to cling to it out of aversion to loss.
While it's important to execute the roadmap from a security perspective, it's also important to accept changes to the route or the original goals. Researchers at the University of Maryland found that hackers attempt a cyberattack every 39 seconds: a clear indication that, with attacks coming at such a high frequency, security methods and plans must keep adapting. Security leaders cannot become so attached to the initial roadmap that they fail to reprioritize around emerging threats.
Availability Heuristic
The availability heuristic holds that, when evaluating a particular situation or outcome, people tend to rely on whatever information they can recall most quickly rather than on the data in front of them.
The success of social engineering amply demonstrates this theory. Employees act quickly and often without thinking. When a link "looks" safe, a distracted or overworked employee may not immediately recognize that it is suspicious. Many phishing attempts appear legitimate, and if employees fall back on the availability heuristic, recalling only what comes to mind most easily, they may fail to identify the fraudulent attempt.
People always want to cut corners, and even with all the data available, there's no guarantee they will comb through it thoroughly to make the "right" decision. No matter how well trained staff are against social engineering, heavy workloads and hectic schedules make some reliance on the availability heuristic inevitable. It also means that anyone can easily fall victim to phishing.
Behavioral Economics in Cybersecurity
It's clear that in the world of cybersecurity, behavioral economics is inescapable: no matter how much data is at your disposal, innate humanity still has an impact. Security professionals are not only affected by the biases of colleagues elsewhere in the business; they are also at risk of falling victim to these biases themselves. Fortunately, recognizing that humans are not perfectly calculating robots can help limit the negative effects of behavioral economics.
Understanding how emotions affect work can enable cybersecurity professionals to move security forward more effectively. Understanding the availability heuristic, the sunk cost fallacy, and mental accounting can help frame security decisions as sound investments. The more we understand behavioral economics, the more effectively we can present security investments and decisions as wins for the business.