Security Awareness Program Jigglers

Does Your Security Awareness Program Need a Jiggler?

How much information security do you need?
How much is too much? How much is too little?

We've heard security consultants say that if nothing goes wrong, you have too much security. If something goes wrong, you had too little. This answer tempts management to embrace the "No news is good news" information security philosophy. But that's not always true. If you're not hearing about information security concerns, your security program may be running too smoothly.

Staff and contractors may be attending the security awareness presentations each year (hearing, but not listening), nodding (or nodding off) at the appropriate moments. They return to their day-to-day work with no change in their behavior and no greater recognition of the importance of security or of the information they are responsible for protecting.

Briefings on information security usually start with how much we rely on technology and the dangers that await the unwary, unwise, or careless. This message can be over-used, causing learners to become complacent or bored until something tragic happens. When someone says, "It won't happen here," it probably already has.

The greatest threats are mundane, not exotic: accidents, errors, and omissions; low-tech insider crime and operator error made possible by poor procedures; lapses in discipline (untested backups, weak passwords); and lack of awareness of basic risks and countermeasures. Often, the most significant threat is the belief that no threat exists.

If this is the situation at your organization, your security program may need a jiggler.


[Image: an outdated radar installation, captioned "Does your security awareness program need a jiggler?"]

What's a Jiggler?

In his book, "The Secrets of Consulting," Gerald Weinberg tells a story that illustrates the function of a jiggler. As electronic systems become more complex, they begin to act like living systems. Early radar systems were like animals that exist in the wild but cannot be bred or kept alive in captivity: like wild animals, the first radar systems would work under combat conditions, but not in the laboratory.

Before World War II, people didn't realize that a large, complex radar system could be so well managed and function so smoothly that, in an overly controlled and predictable environment, it would get "stuck." Radar systems depend on a noisy environment. To keep systems from sticking, random motion generators were attached to the mounting racks to break up the stable states in which the equipment tended to get stuck. These generators were called "jigglers."

The same principle applies to security awareness programs. Sometimes it takes an outsider, a security breach, or even a disaster, to get management to see that the program has become stuck and no longer reacts to potential incidents.

Almost every incident is preceded by a series of trivial events that form a chain. A break in any link could prevent the incident. Staff must be trained to recognize potential incidents and break the chain.

Ideas for Jiggling Your Security Program

  • Look for any outside agent entering your organization to act as a jiggler. This could be a new employee who asks questions during the security orientation, a manager who has changed functions or responsibilities, or an independent reviewer - anyone with a fresh perspective of your security environment.
  • Ask your staff how they would break into the system; the people closest to it ought to know the vulnerabilities.
  • Ask your staff to consider the question, "Are data security breaches predictable?"
  • An organization where staff are ignorant of information security or where security responsibilities are not clearly assigned is a breeding ground for security incidents. If you don't give a security task to someone, it won't get done. If security reports are not taken seriously, people will stop reporting.
  • If basic security practices and procedures are not up-to-date or followed, adverse events are more likely to occur.

Gavin de Becker, the nation's leading predictor of violent behavior, writes:

The failure to take the obvious steps of calling references is an epidemic in America. A common excuse is that references will say only good things about the candidate since the candidate has prepared them for the call. In fact, there is a tremendous amount of information that can be gained from references in terms of confirming facts on the application.

He cites a case in which several people were seriously injured by an employee's intentional actions. In reviewing the situation, he discovered a poorly conducted pre-employment inquiry: home telephone numbers of the applicant's relatives were listed as previous places of employment, references were not called, and information on the application was not confirmed.

Enlightened management recognizes the value of a jiggler to stir things up and look at the way things are done from a fresh perspective. These managers know that measuring risks is not as important as avoiding risks, and one of the best ways to avoid risks is to continually review and improve the information security program.

Security program managers must not stand on the train tracks calculating the speed of the oncoming train. They must move their organizations toward better information security practices and procedures that are routinely "tweaked" by a jiggler.

In-House Skeptics

Did you know? The U.S. Army runs an internal disruption unit: the University of Foreign Military and Cultural Studies at Fort Leavenworth, Kansas, also dubbed "Red Team University." The school's curriculum is designed to combat institutional groupthink in the military by training and empowering in-house skeptics.