The RISKS Digest
Volume 15 Issue 16

Tuesday, 19th October 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Social Psychology & INFOSEC
Mich Kabay
Info on RISKS (comp.risks)

Social Psychology & INFOSEC

"Mich Kabay / JINBU Corp." <75300.3232@compuserve.com>
19 Oct 93 05:15:29 EDT
SOCIAL PSYCHOLOGY AND INFOSEC:
Psycho-Social Factors in the Implementation of Information Security Policy

M. E. Kabay, Ph.D.

Director of Education, National Computer Security Association
Carlisle, PA

President, JINBU Corporation
P. O. Box 509 / Westmount, QC
H3Z 2T6 Canada
Internet: 75300.3232@compuserve.com


INTRODUCTION

Security policies and procedures affect not only what people do but also
how they see themselves, their colleagues and their world.  Despite the
importance of these psychosocial issues, security personnel pay little or
no attention to what
is known about social psychology.  The established principles of human
social behaviour have much to teach us in our attempts to improve corporate
and institutional information security.

Information security specialists concur that security depends on people
more than on technology.  Another commonplace is that employees are a far
greater threat to information security than outsiders.

It follows from these observations that improving security depends on
changing beliefs, attitudes and behaviour, both of individuals and of
groups.  Social psychology can help us understand how best to work with
human predilections and predispositions to achieve our goals of improving
security:

o   research on social cognition looks at how people form impressions about
reality (knowing these principles, we can better teach our colleagues and
clients about effective security);

o   work on attitude formation and beliefs helps us present information
effectively and so convince employees and others to cooperate in improving
security;

o   scientists studying persuasion and attitude change have learned how best
to change people's minds about unpopular views such as those of the
security community;

o   studies of factors enhancing prosocial behaviour provide insights on how
to foster an environment where corporate information is willingly
protected;

o   knowledge of the phenomena underlying conformity, compliance and
obedience can help us enhance security by encouraging compliance and by
protecting staff against social pressure to breach security;

o   group psychology research provides warnings about group pathology and
hints for working better with groups in establishing and maintaining
information security in the face of ingrained resistance.

The following discussion is based on well-established principles of social
psychology.  Any recent introductory college textbook in this field will
provide references to the research that has led to the principles which are
applied to security policy implementation.  In this paper, references are
to Lippa, R. A. (1990).  Introduction to Social Psychology.  Wadsworth
(Belmont, CA).  ISBN 0-534-11772-4.


SOCIAL COGNITION

Schemas are self-consistent views of reality.  They help us pay attention
to what we expect to be important and to ignore irrelevant data.  They also
help us organize our behaviour [Lippa, p. 141].  For example, our schema
for relations at the office includes polite greetings, civil discussions,
written communications, and businesslike clothes.  The schema excludes
obscene shrieks, abusive verbal attacks, spray-painted graffiti and
colleagues dressed in swim suits.  It is the schema that lets people tell
what is inappropriate in a given situation.

Security policies and procedures conflict with most people's schemas.
Office workers' schemas include sharing office supplies ('Lend me your
stapler, please?'), trusting your team members to share information ('Take
a look at these figures, Sally'), and letting your papers stay openly
visible when you have to leave your desk.  Unfortunately, sharing user IDs,
showing sensitive information to someone who lacks the appropriate
clearance, and leaving work stations logged on without protection are gross
breaches of a different schema.  Normal politeness dictates that when a
colleague approaches the door we have just opened, we hold the door open
for them; when we see a visitor, we smile politely (who knows, it may be a
customer).  In contrast, access policies require that we refuse to let even
a well-liked colleague piggy-back their way through an access-card system;
security policies insist that unbadged strangers be challenged or reported
to security personnel.  Common sense tells us that when the Chief Executive
Officer of the company wants something, we do it; yet we try to train
computer room operators to forbid entry to anyone without documented
authorization--including the CEO.

Schemas influence what we perceive [Lippa, p. 143].  For example, an
employee refuses to take vacations, works late every night, is never late,
and is never sick.  A model employee?  Perhaps, from one point of view.
From the security point of view, the employee's behaviour is suspect.
There have been cases where such people have actually been embezzlers
unable to leave their employment: even a day away might result in
discovery.  Saint or sinner?  Our expectations determine what we see.

Schemas influence what we remember [Lippa, p. 145].  When information
inconsistent with our preconceptions is mixed with details that fit our
existing schemas, we selectively retain what fits and discard what
conflicts.  When we have been fed a diet of movies and television shows
illustrating the premise that information is most at risk from brilliant
hackers, why should we remember the truth--that carelessness and
incompetence by authorized users of information systems cause far more harm
than evil intentions and outsiders ever do?

Before attempting to implement policies and procedures, we should ensure
that we build up a consistent view of information security among our
colleagues.  In light of the complexity of social cognition, our usual
attempts to implement security policies and procedures seem pathetically
inept.  A couple of hours of lectures followed by a video, a yearly ritual
of signing a security policy that seems to have been written by
Martians--these are not methods that will improve security.  These are
merely lip service to the idea of security.

According to research on counter-intuitive information, people's judgement
is influenced by the manner in which information is presented.  For
example, even information contrary to established schemas can be
assimilated if people have enough time to integrate the new knowledge into
their world-views [Lippa, p. 148].  It follows that security policies
should be introduced over a long time, not rushed into place.

Preliminary information may influence people's responses to information
presented later.  For example, merely exposing experimental subjects to a
list of words such as `reckless' or `adventurous' affects their judgement
of risk-taking behaviour in a later test.  It follows that when preparing
to increase employee awareness of security issues, presenting case-studies
is likely to have a beneficial effect on participants' readiness to examine
security requirements.

Pre-existing schemas can be challenged by several counter-examples, each of
which challenges a component of the schema [Lippa, p. 153].  For example,
prejudice about an ethnic group is more likely to be changed by contact
with several people, each of whom contradicts a different aspect of the
prejudiced schema.  It follows that security awareness programs should
include many realistic examples of security requirements and breaches.
Students in the NCSA's Information Systems Security Course have commented
on the unrealistic scenario in a training video they are shown: a series of
disastrous security breaches occurs in the same company.  Based on the
findings of cognitive social psychologists, the film would be more
effective for training if the incidents were dramatized as occurring in
different companies.

Judgements are easily distorted by the tendency to rely on personal
anecdotes, small samples, easily available information, and faulty
interpretation of statistical information [Lippa, pp. 155-163].  Basically,
we humans are not rational processors of factual information.  If security
awareness programs rely strictly on presentation of factual information
about risks and proposed policies and procedures, they will run up against
our stubborn refusal to act logically.  Security program implementation
must engage more than the rational mind.  We must appeal to our colleagues'
imagination and emotion as well.  We must inspire a commitment to security
rather than merely describing it.

Perceptions of risks and benefits are profoundly influenced by the wording
in which situations and options are presented [Lippa, p. 163].  For
example, experimental subjects responded far more positively to reports of
a drug with `50% success' than to the same drug described as having `50%
failure.' It follows that practitioners should choose their language
carefully during security awareness campaigns.  Instead of focusing on
reducing failure rates (breaches of security), we should emphasize
improvements of our success rate.


BELIEFS AND ATTITUDES

Psychologists distinguish between beliefs and attitudes.  `A belief ...
refers to cognitive information that need not have an emotional
component....' An attitude refers to `an evaluation or emotional
response....' [Lippa, p. 238].  Thus a person may believe that copying
software without authorization is a felony while nonetheless having the
attitude that it doesn't matter.

Beliefs can change when contradictory information is presented, but some
research suggests that it can take up to a week before significant shifts
are measurable.  Other studies suggest that when people hold contradictory
beliefs, providing an opportunity to articulate and evaluate those beliefs
may lead to changes that reduce inconsistency.  These findings imply that a
new concern for corporate security must be created by exploring the current
structure of beliefs among employees and managers.  Questionnaires, focus
groups, and interviews may not only help the security practitioner, they
may actually help move the corporate culture in the right direction.

An attitude, in the classical definition, `is a learned evaluative
response, directed at specific objects, which is relatively enduring and
influences behaviour in a generally motivating way' [Lippa, p. 221].  The
advertising industry spends over $50B yearly to influence public attitudes
in the hope that these attitudes will lead to changes in spending
habits--that is, in behaviour.

Research on classical conditioning suggests that attitudes can be learned
even through simple word association [Lippa, p. 232].  If we wish to
move our colleagues towards a more negative view of computer criminals, it
is important not to portray computer crime using positive images and words.
Movies like Sneakers may do harm indirectly by associating pleasant,
likeable people with techniques that are used for industrial espionage.
When teaching security courses, we should avoid praising the criminals we
describe in case studies.

One theory on how attitudes are learned suggests that rewards and
punishments are important motivators.  Studies show that even apparently
minor encouragement can influence attitudes.  A supervisor or instructor
should praise any comments that are critical of computer crime or which
support the established security policies.  Employees who dismiss security
concerns or flout the regulations should be challenged on their attitudes,
not ignored.


PERSUASION AND ATTITUDE CHANGE

Persuasion--changing someone's attitudes--has been described in terms of
communications [Lippa, p. 258].  The four areas of research are

o   communicator variables: who is trying to persuade?

o   message variables: what is being presented?

o   channel variables: by what means is the attempt taking place?

o   audience variables: at whom is the persuasion aimed?

Attractiveness, credibility and social status have strong effects
immediately after the speaker or writer has communicated with the target
audience; however, over a period of weeks to a month, these effects decline
until message content becomes the predominant influence.  We can use this
phenomenon
by identifying the senior executives most likely to succeed in setting a
positive tone for subsequent security training.  We should look for
respected, likeable people who understand the issues and sincerely believe
in the policies they are advocating.

Fear can work to change attitudes only if judiciously applied.  Excessive
emphasis on the terrible results of poor security is likely to backfire,
with participants in the awareness program rejecting the message
altogether.  Frightening consequences should be coupled immediately with
effective and achievable security measures.

Some studies suggest that presenting a balanced argument helps convince
those who initially disagree with a proposal.  Presenting objections to a
proposal and offering counter-arguments is more effective than one-sided
diatribes.  The Software Publishers Association training video, It's Just
Not Worth the Risk, uses this technique: it shows several members of a
company arguing over copyright infringement and fairly presents the
arguments of software thieves before demolishing them.

Modest repetition of a message can help generate a more positive response.
Thus security awareness programs which include imaginative posters, mugs,
special newsletters, audio and video tapes and lectures are more likely to
build and sustain support for security than occasional intense sessions of
indoctrination.

The channel through which we communicate has a strong effect on attitudes
and on the importance of superficial attributes of the communicator.
`Face-to-face persuasion often proves to have more impact than persuasion
through the mass media....  [because they] are more salient, personal and
attention-grabbing, and thus they often stimulate more thought and
commitment to their persuasive messages' [Lippa, p. 264].  Security
training should include more than tapes and books; a charismatic teacher or
leader can help generate enthusiasm for--or at least reduce resistance
to--better security.

Workers testing cognitive response theory [Lippa, p. 289] have studied many
subtle aspects of persuasion.  For example, experiments have shown that
rhetorical questions (e.g., `Are we to accept invasions of our computer
systems?') are effective when the arguments are solid but
counter-productive when arguments are weak.

In comparing the central route to persuasion (i.e., consideration of facts
and logical arguments) with the peripheral (i.e., influences from logically
unrelated factors such as physical attractiveness of a speaker),
researchers find that the central route `leads to more lasting attitudes
and attitude changes....' [Lippa, p. 293].

As mentioned above, questionnaires and interviews may help cement a
favourable change in attitude by leading to commitment.  Once employees
have publicly avowed support for better security, some will begin to change
their perception of themselves.  As a teacher of information security, I
find that I now feel much more strongly about computer crime and security
than I did before I created my courses.  We should encourage specific
employees to take on public responsibility for information security within
their work group.  This role should periodically be rotated among the
employees to give everyone the experience of public commitment to improved
security.


PROSOCIAL BEHAVIOUR

Studies of how and why people help other people have lessons for us as we
work to encourage everyone in our organizations to do the right thing.  Why
do some people intervene to stop crimes?  Why do others ignore crimes or
watch passively?  Latane and Darley [Lippa, p. 493] have devised a schema
that describes the steps leading to prosocial behaviour:

o   People have to notice the emergency or the crime before they can act.
Thus security training has to include information on how to tell that
someone may be engaging in computer crime.

o   The situation has to be defined as an emergency--something requiring
action.  Security training that provides facts about the effects of
computer crime on society and solid information about the need for security
within the organization can help employees recognize security violations as
emergencies.

o   We must take responsibility for acting.  The bystander effect comes into
play at this stage.  The larger the number of people in a group confronted
with an emergency, the slower the average response time.  Larger groups
seem to lead `to a diffusion of responsibility whereby each person felt
less personally responsible for dealing with the emergency' [Lippa, p.
497].  Another possible factor is uncertainty about the social climate;
people fear `appearing foolish or overly emotional in the eyes of those
present.'  We can address this component of the process by providing a
corporate culture which rewards responsible behaviour such as reporting
security violations.

o   Having taken responsibility for solving a problem, we must decide on
action.  Clearly written security policies and procedures will make it more
likely that employees act to improve security.  In contrast, contradictory
policies, poorly-documented procedures, and inconsistent support from
management will interfere with the decision to act.

Another analysis proposes that people implicitly analyze costs of helping
and of not helping when deciding whether to act prosocially.  The
combination of factors most conducive to prosociality is low cost for
helping and high cost for not helping.  Security procedures should make it
easy to act in accordance with security policy; e.g., there should be a
hot-line for reporting security violations, anonymity should be respected
if desired, and psychological counselling and followup should be available
if people feel upset about their involvement.  Conversely, failing to act
responsibly should be a serious matter; personnel policies should document
clear and meaningful sanctions for failing to act when a security violation
is observed; e.g., inclusion of critical remarks in employment reviews and
even dismissal.

One method that does not work to increase prosocial behaviour is
exhortation [Lippa, p. 513].  That is, merely lecturing people has little
or no effect.  On the other hand, the general level of stress and pressure
to focus on narrow tasks can significantly reduce the likelihood that
people will act on their moral and ethical principles.  Security is likely
to flourish in an environment that provides sufficient time and support for
employees to work professionally; offices where everyone responds to
self-defined emergencies all the time are unlikely to pay attention to
security violations.

Some findings from research confirm common sense.  For example, guilt
motivates people to act more prosocially.  This effect works best `when
people are forced to assume responsibility....'  Thus enforcing standards
of security using reprimands and sanctions can indeed increase the
likelihood that employees will subsequently act more cooperatively.  In
addition, mood affects susceptibility to prosocial pressures:  bad moods
make prosocial behaviour less likely, whereas good moods increase
prosociality.  A working environment in which employees are respected is
more conducive to good security than one which devalues and abuses them.
Even cursory acquaintance with other people makes it more likely that we
will help them; it thus makes sense for security supervisors to get to know
the staff from whom they need support.  Encouraging social activities in an
office (lunch groups, occasional parties, charitable projects) enhances
interpersonal relationships and can improve the climate for effective
security training.


CONFORMITY, COMPLIANCE AND OBEDIENCE

Turning a group into a community provides a framework in which social
pressures can operate to improve our organization's information security.
People respond to the opinions of others by (sometimes unconsciously)
shifting their opinion towards the mode.  Security programs must aim to
shift the normative values (the sense of what one should do) towards
confidentiality, integrity and availability of data.  As we have seen in
public campaigns aimed at reducing drunk driving, it is possible to shift
the mode.  Twenty years ago, many people believed that driving while
intoxicated was amusing; today a drunk driver is a social pariah.  We must
move towards making computer crime as distasteful as public drunkenness.

The tendency towards conformity increases when people within the group like or
admire each other [Lippa, p. 534].  In addition, the social status of an
individual within a group influences that individual's willingness to
conform.  High-status people (those liked by most people in the group) and
low-status people (those disliked by the group) both tend to be more
autonomous and less compliant than people liked by some and disliked by
others [Lippa, p. 536].  Therefore security officers should pay special
attention to these outliers during instruction programs.  Managers should
monitor compliance more closely at both ends of the popularity range.
Contrariwise, if security practices are currently poor and we want allies
in changing the norm, we should work with the outliers to resist the herd's
anti-security bias.

`The norm of reciprocity holds that we should return favours in social
relations' [Lippa, p. 546].  Even a small, unexpected or unsolicited (and
even unwanted) present increases the likelihood that we will respond to
requests.  A security awareness program that includes small gifts such as a
mug labelled `SECURITY IS EVERYONE'S BUSINESS' or an inexpensive booklet
such as the Information Systems Security Pocket Guide (available from the
NCSA) can help get people involved in security.

The `foot in the door' technique suggests that we `follow a small initial
request with a much larger second request' [Lippa, p. 549].  For example,
we can personally ask an employee to set a good example by blanking their
screen and locking their terminal when they leave their desk.  Later, once
they have begun the process of redefining themselves ('I am a person
who cares about computer security'), we can ask them for something more
intense, such as participating in security training for others (e.g.,
asking each colleague to blank their screen and lock their terminal).


GROUP BEHAVIOUR

Early studies on the effects of being in groups produced contradictory
results: sometimes people did better at their tasks when there were other
people around and sometimes they did worse.  Eventually, social
psychologist Robert Zajonc [Lippa, p. 572 ff.] realized that `The presence
of others is arousing, and this arousal facilitates dominant, well-learned
habits but inhibits nondominant, poorly-learned habits.'  Thus when trying
to teach employees new habits, it is counter-productive to put them into
large groups.  Individualized learning (e.g., computer-based training,
video tapes) can overcome the inhibitory effect of groups in the early
stages of behavioural change.

Another branch of research in group psychology deals with group
polarization.  Groups tend to take more extreme decisions than the
individuals in them would have taken alone [Lippa, p. 584].  In group
discussions of the need for security, polarization can involve deciding to
take more risks--by
reducing or ignoring security concerns--than any individual would have
judged reasonable.  Again, one-on-one discussions of the need for security
may be a more effective approach to building a consensus that supports
cost-effective security provisions than large meetings.

In the extreme, a group can display groupthink, in which a consensus is
reached because of strong desires for social cohesion [Lippa, p. 586 ff.].
When groupthink prevails, evidence contrary to the received view is
discounted; opposition is viewed as disloyal; dissenters are discredited.
Especially worrisome for security professionals, people in the grip of
groupthink tend to ignore risks and contingencies. To prevent such
aberrations, the leader must remain impartial and encourage open debate.
Experts from the outside (e.g., respected security consultants) should be
invited to address the group, bringing their own experience to bear on the
group's requirements. After a consensus has been achieved, the group should
meet again and focus on playing devil's advocate to try to come up with
additional challenges and alternatives.


CONCLUSIONS

By viewing information security as primarily a management issue, we can
benefit from the mass of knowledge accumulated by social psychologists.  We
can implement security policies and procedures more easily by adapting our
training and awareness techniques to correspond to human patterns of
learning and compliance.


SUMMARY OF RECOMMENDATIONS


1.  Before attempting to implement policies and procedures, we should ensure
that we build up a consistent view of information security among our
colleagues.

2.  Security policies should be introduced over a long time, not rushed into
place.

3.  Presenting case-studies is likely to have a beneficial effect on
participants' readiness to examine security requirements.

4.  Security awareness programs should include many realistic examples of
security requirements and breaches.

5.  We must inspire a commitment to security rather than merely describing
it.

6.  Emphasize improvements rather than reduction of failure.

7.  A new concern for corporate security must be created by exploring the
current structure of beliefs among employees and managers.

8.  Do not portray computer crime using positive images and words.

9.  Praise any comments that are critical of computer crime or which support
the established security policies.

10. Employees who dismiss security concerns or flout the regulations should
be challenged on their attitudes, not ignored.

11. Identify the senior executives most likely to succeed in setting a
positive tone for subsequent security training.

12. Frightening consequences should be coupled immediately with effective
and achievable security measures.

13. Presenting objections to a proposal and offering counter-arguments is
more effective than one-sided diatribes.

14. Security awareness programs should include repeated novel reminders of
security issues.

15. In addition to tapes and books, rely on a charismatic teacher or leader
to help generate enthusiasm for better security.

16. Encourage specific employees to take on public responsibility for
information security within their work group.

17. Rotate the security role periodically.

18. Security training should include information on how to tell that
someone may be engaging in computer crime.

19. Build a corporate culture which rewards responsible behaviour such as
reporting security violations.

20. Develop clearly written security policies and procedures.

21. Security procedures should make it easy to act in accordance with
security policy.

22. Failing to act in accordance with security policies and procedures
should be a serious matter.

23. Enforcing standards of security can increase the likelihood that
employees will subsequently act more cooperatively.

24. A working environment in which employees are respected is more
conducive to good security than one which devalues and abuses them.

25. Security supervisors should get to know the staff from whom they need
support.

26. Encourage social activities in the office.

27. Pay special attention to social outliers during instruction programs.

28. Monitor compliance more closely at both ends of the popularity range.

29. Work with the outliers to resist the herd's anti-security bias.

30. Include small gifts in your security awareness program.

31. Start improving security a little at a time and work up to more
intrusive procedures.

32. Before discussing security at a meeting, have one-on-one discussions
with the participants.

33. Remain impartial and encourage open debate in security meetings.

34. Bring in experts from the outside when faced with groupthink.

35. Meet again after a consensus has been reached and play devil's advocate.
