Red Teaming Your Information Governance Program

July 13th, 2017

George Despres, CRM
Program Director for University Records Management, Brandeis University

(The content in this blog reflects the opinions of the author, and not of Brandeis University.)

About Red Teaming

Have you deliberately challenged your own program plans and procedures recently? In his book on the topic, Bryce Hoffman defines red teaming as:

“a… way to stress-test strategies, flush out unseen threats and missed opportunities, and execute more successfully…. [by] making critical and contrarian thinking part of the planning process.”

Red teaming (RT), with origins in the military, involves thinking differently and more objectively about your program, looking under rocks, and grasping alternative views of the way things are. “Murder boarding,” or grilling an idea or presenter before actual product delivery, is a similar approach to cleaning up planning and execution. However, RT acts upon live environments, not just rehearsals. And while Six Sigma identifies errors and deviations in programs, RT goes beyond this by also surfacing alternative and novel ideas, observations, and ways for improving things.

Our mission is greatly challenged by the dynamics of info generation and by rapid changes to info environments. Faced with these challenges, it is easy to fall into real-time, project-to-project, status quo thinking without thoroughly questioning where we are and where we are going. Program successes can render us dangerously comfortable. Professional or departmental groupthink can likewise soften our ability to evolve in changing environments. We must beware subtle complacency by stopping and kicking the tires in order to harden our programs.

RT philosophy is familiar to the cyber security community, which employs penetration (pen) testing and white-hat self-hacking activities. However, RT has applications across the IG spectrum, beyond info security. The following focuses on applying RT to these other areas of IG.


The Psychology of It All, or the “I’m Objective” Fallacy

Drawing on decision psychology research, Hoffman outlines categories of bias and natural thinking frameworks that can impede our view of actual situations, including:

  • Confirmation bias, where we give greater credibility to info that supports what we already believe (think politics).
  • Anchoring bias, where an initial value or offer ($50,000) narrows the range of possibilities (think Shark Tank).
  • Automation bias, where we fail to question automated system output after coming to rely on it; this threat grows in our data analytics world.
  • Outcome bias, where we assume a positive outcome resulted from the right decision rather than from luck and other factors.
  • Status quo bias, which shouldn’t need any explanation here.

There are other types of bias, and RT brings a rigor and detached analysis that helps us to overcome them.


Executing Red Team Lite, or Lean Red Teaming

How do we apply RT to our programs? The solutions depend upon our resources, and I’ll assume that most of us don’t have much. Hoffman notes a range of possibilities, from hiring consultant teams or RT facilitators, to dedicated executive or rotating teams, to ad hoc RT using bits of red team philosophy. Participant selection should balance the needs of impartiality with enterprise and industry savvy. While whole standing teams are ideal, I’d suggest that “RT lite” or “lean RT” may be more realistically employed with scant resources. This can be performed through devil’s advocacy.

Devil’s advocates have long been paired with RT exercises. In his book Originals, Adam Grant shows that effective devil’s advocates need to be legitimate and sincere in their assessment of, or opposition to, the status quo; they shouldn’t be phony role players. The good news: most of us are already armed with legitimate devil’s advocacy. How? By using the pushback that our programs actually face from naysayers and outright opponents within our organizations. The pains in our behinds, our biggest detractors, are actually doing us a favor and feeding us RT ammo.

The following are pushback comments that I and some of my colleagues have actually been confronted with. You will recognize them. Think up some of your own and add them to the list.

  • Digital storage is cheap. We’re good.
  • Why should we invest so much money in e-discovery preparedness when our business only takes one or two cases to trial per year?
  • Big data can make all sorts of correlations that we can’t imagine today. We need to keep all of our data lake content to be able to harness its potential value. Data that looks useless now could be critical in the future and pay off big time. There’s a baby in the bathwater.
  • [Senior executive:] You can’t expect me not to use WhatsApp when I’m doing business on the road.
  • We have plenty of locked basement spaces on our corporate campus. Why do we need to pay more for offsite vendor storage?
  • So you’ve published the retention schedule. Is everyone really implementing its policies consistently?
  • [IT manager:] We don’t need your input on the new ERP system. This is a technology solution.
  • My department’s (generic) file share is much easier to use than the document management system, which I hate.
  • We have cyber security in place. Why should we worry about other info management functions? Sounds like a “nice to have” that we can’t afford.

Now, take some time with smart colleagues and respond to each of these with respect to your IG or RIM program. Capture your responses, state each underlying problem, brainstorm alternative possibilities, and thoroughly review them with your group. Then draw up and assign actions based on this review.

Hopefully, this serves as a modest starting point for you to apply more red team thought and approaches to your specific program. There’s further RT reading, such as Hoffman’s book (I’ve never met him), that can help you apply it more thoroughly if resources allow. At the very least, question your IG program, belief system, and biases often. This will reduce the possibility of getting blindsided. The passive perish and the complacent crumble.
