Beyond Awareness

It started with the COSO Internal Control – Integrated Framework published in 1992: control awareness was identified as a key element of an effective control environment. [1] The argument is logically appealing: if employees and managers aren’t aware of internal controls and their importance, how are they supposed to apply them consistently and effectively?

I cannot imagine how many internal audit findings would never have been written if the “control awareness” concept hadn’t been around. Whatever the observation, a lack of control awareness was a root cause easily added and difficult to challenge. If an auditor didn’t find anything else, she could always construe a need to improve “control awareness” that audited management would find hard to refute.

Awareness has also been in high demand in many related fields, basically in any governance, risk, security, quality or compliance area. For instance, IT security professionals have been advocating more IT security awareness [2], and compliance officers look to increase compliance awareness. To sum it up: everybody wants more risk awareness. The inherent assumption is that if only management and employees maintained a high level of “risk awareness” at all times (i.e. paid attention to everything that could possibly go wrong), they would be more likely to take appropriate and effective control measures to deal with all these risks.

This primacy of awareness for desirable behavior is essentially founded in rational choice theory, which assumes individuals are rational agents who take into account all available information, event probabilities, and potential costs and benefits before making a consistent, conscious choice of the best course of action. [3]

But looking at the reality of organizations, we don’t have to look far to find that this has never really worked. Or has it?

Awareness does not automatically lead to action

The first fallacy is the assumption that awareness automatically leads to (correct and effective) behavior. It does not.

“Between stimulus and response, there is a space. In that space lies our freedom and our power to choose our response.”

– attributed to Viktor Frankl

Although I am neither a psychologist nor a scholar of behavioral economics or decision-making theory, I take the liberty of sketching a simple decision-making model from first principles.

Even with a high level of risk awareness, the following conditions must still be met for consciously correct behavior in a given risky situation:

  1. Receive the stimulus in a given situation or condition that could lead up to a risk event.
  2. Recognize that the stimulus falls into a risk category.
  3. Know what the appropriate behavior is under these circumstances.
  4. Be capable (physically, mentally, equipped with necessary resources…) of taking the appropriate action.
  5. Be motivated to take the appropriate action.
  6. Actually take the action.

It is obvious that many different factors and conditions lead up to a conscious decision to exhibit correct and compliant behavior. And for each of these factors we can think of different failure modes.
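The chain above can be sketched as a simple conjunction of conditions, where any single failure breaks the whole chain. The step names and the agent representation below are purely illustrative, not from any formal model:

```python
# Hypothetical sketch: correct behavior only results if EVERY step in the
# chain succeeds; awareness corresponds to just one or two of these steps.

STEPS = [
    "receive_stimulus",          # 1. perceive the situation
    "recognize_risk",            # 2. classify it as risky (the "awareness" part)
    "know_appropriate_behavior", # 3. know what to do
    "is_capable",                # 4. be able to do it
    "is_motivated",              # 5. want to do it
    "takes_action",              # 6. actually do it
]

def behaves_correctly(agent: dict) -> bool:
    """Return True only if every step in the chain succeeds."""
    return all(agent.get(step, False) for step in STEPS)

# A fully aware agent who recognizes the risk but lacks motivation
# still fails to act:
agent = {step: True for step in STEPS}
agent["is_motivated"] = False
print(behaves_correctly(agent))  # → False
```

The point of the sketch is the failure-mode structure: raising awareness improves only the `recognize_risk` step, while the chain can still break at any of the others.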

Awareness is just one of those factors. So while awareness is correlated with desired behavior, it is neither a necessary nor a sufficient condition for it (think of correct behavior by chance or intuition).

If we look more deeply into the behavioral sciences (e.g. the work of the Behavioural Insights Team [4]), we find an even more complex framework of the process from “noticing” through “deliberation” to “executing”, with eight cognitive biases influencing behavior.

The limits of working memory

In addition, even assuming “fairly rational” actors, we can hardly demand that managers be aware of any and all possible risks at any given moment and still do a good job of fulfilling their actual tasks.

Cognitive psychology tells us that human beings can keep only a very limited number of ideas (seven plus or minus two, according to Miller’s Law) in conscious working memory at the same time. [5] This of course limits the number of risks that a manager can be aware of at any single moment while still doing her actual work, which means making decisions and taking actions towards achieving her business objectives. And this number decreases further with stress, time pressure, etc.
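A crude way to picture this constraint is working memory as a bounded buffer: attending to a new item pushes out an older one. The capacity constant and the eviction policy below are illustrative simplifications, not a cognitive model:

```python
from collections import deque

# Illustrative sketch of Miller's "seven plus or minus two": working memory
# as a fixed-size buffer where new items displace the oldest ones.
WORKING_MEMORY_CAPACITY = 7  # assumed capacity, per Miller's Law

working_memory = deque(maxlen=WORKING_MEMORY_CAPACITY)

# A manager asked to stay "aware" of ten risks in sequence...
for risk in [f"risk_{i}" for i in range(1, 11)]:
    working_memory.append(risk)

# ...can only hold the last seven in mind; the first three have dropped out.
print(list(working_memory))
# → ['risk_4', 'risk_5', 'risk_6', 'risk_7', 'risk_8', 'risk_9', 'risk_10']
```

Under stress or time pressure the effective capacity shrinks further, which in this picture means even more risks silently falling out of the buffer.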

Let’s keep in mind that management’s primary responsibility is doing business and not being constantly aware of all risks imaginable.

But if not awareness – then what?

True competence is unawareness

According to Noel Burch’s “Four Stages for Learning Any New Skill” [6] there is a hierarchy of competence in acquiring and applying a new skill:

  1. Unconscious incompetence
  2. Conscious incompetence
  3. Conscious competence
  4. Unconscious competence

Think back to learning to play the piano, to speak a foreign language, or to drive a car. First you are not aware of how bad you are, or of what picking up the skill actually implies (stage 1). After realizing how difficult it is (stage 2), you start making progress and can apply the skill with concentration, making fewer and fewer mistakes (stage 3), until you master it without active concentration (stage 4): you can, for example, play a piano piece by heart while reading a comic book (like my little sister did when we were kids), form new sentences in Turkish intuitively, or drive while engaged in a conversation with your passenger.

Applied to risk, control and compliance awareness, I believe we need to look less to awareness and its related training and certification rituals. Rather, the classical GRC professions need to embrace insights from the social sciences, in particular psychology and behavioral economics, and examine ways to influence and support agents to make the majority of good decisions unconsciously, using choice architecture, nudging, algorithms and the like.

In addition, we need to continue leveraging technology and analytics to improve predictive monitoring capabilities and early warning systems in order to trigger alerts to direct decision makers’ attention to risks that matter at the time and place of decision making.

Think of driving a car: the cockpit shows a variety of metrics and indicators such as the current speed, the amount of fuel in the tank and the exterior temperature. As long as everything is in a “normal” range of operation, you as the driver manage the achievement of your objective (arriving at your destination within a certain time window) mostly unconsciously, making decisions to accelerate, brake, steer and so on.

Then there are plenty of sensors and warning mechanisms to alert you to risks: distance sensors warn you of the close proximity of other cars or objects, temperature sensors alert you to low outside temperatures where the road could become slippery with ice, and an indicator light warns you when your fuel level is very low. And regarding the big picture (achieving the objective of reaching your destination), your SatNav (or Google Maps on your phone) indicates the turns you need to take, displays the ETA, and identifies congestion on your route along with alternative routes as choice alternatives (opportunities).

All of these decision support systems leave you as the driver free to drive mostly unconsciously, but they raise and focus your attention on the risks, opportunities and decisions that matter for achieving your goals.
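The dashboard logic above amounts to simple threshold-based early warning: the driver stays on autopilot until a reading crosses its alert threshold. The sensor names and threshold values below are illustrative assumptions, not taken from any real vehicle or monitoring system:

```python
# Hypothetical sketch of threshold-based early warning: attention is only
# demanded when a sensor reading crosses its (assumed) alert threshold.

ALERT_RULES = {
    "fuel_level_pct":   lambda v: v < 10,  # low-fuel warning light
    "outside_temp_c":   lambda v: v <= 3,  # possible ice on the road
    "distance_ahead_m": lambda v: v < 15,  # proximity warning
}

def active_alerts(readings: dict) -> list:
    """Return the names of sensors whose readings currently need attention."""
    return [name for name, rule in ALERT_RULES.items()
            if name in readings and rule(readings[name])]

# Everything nominal except the fuel level: only one alert fires, and the
# driver's attention is pulled to exactly that risk.
readings = {"fuel_level_pct": 8, "outside_temp_c": 12, "distance_ahead_m": 40}
print(active_alerts(readings))  # → ['fuel_level_pct']
```

The same pattern scales up: a risk-monitoring system that stays silent in the “normal” range and alerts only on exceptions lets management work unconsciously on the business, exactly as the driver does.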

Conclusion

While I don’t deny the importance of risk awareness, I believe it is over-emphasized in the way GRC professionals view management decision making, and that important insights from the behavioral sciences need to be taken into account and leveraged in addition (see e.g. [7] for behavioral compliance or [8] for IT security). By using algorithms and predictive monitoring based on (big) data analysis, management attention can be focused on the key risks and decisions that matter, while lower-level risks are steered without requiring management awareness of each and every thing that could possibly go wrong.

Wait! Before you leave …

If you liked this article, you may also want to read these posts:

Risk exaggeration – a cognitive bias case study

Naturally biased – why internal auditors cannot adhere to their own code of ethics

The limits of our language…

References:

[1] www.coso.org

[2] How to Make Information Security Awareness Relevant

[3] Rational choice theory, Wikipedia

[4] Behavioral Insights Team

[5] Miller’s Law (“The Magical Number Seven, Plus or Minus Two”), Wikipedia

[6] Four stages of competence, Wikipedia

[7] Behavioral Ethics, behavioral compliance

[8] A Composite Framework for Behavioral Compliance with Information Security Policies
