
The MIND Model of Cult Dynamics

By Cathleen Mann, PhD

M: Manipulation. These are techniques used by cults to ensure compliance by using undue influence. The definition of undue influence has been recognized in common law for 500 years and is a legal definition, not a psychological one. Manipulation can consist of a variety of factors, including those put forth by Cialdini (1984). Manipulation also involves several other elements, such as impression management; lying about facts and history; assuring conformity to a teaching without question (Lifton, 1961); betrayal of confidences; denying reality; and changes to diet, sleeping patterns, and overactivity (Schein, 1961). Hypnosis or other artificial techniques are not necessary when ordinary techniques such as those mentioned above are more than adequate. Manipulation is confirmed when ordinary cult members succeed in converting others (Kent, 2001).

I: Indoctrination. This is a process of deliberate changes to a person's environment without consent, knowledge, or awareness (Zablocki, 2001). Indoctrination does not include personality change, only attitudinal and behavioral change. The changes are not permanent and dissipate when the process of indoctrination ceases (Lifton, 1961; Galanter, 1999). There is no research support for "snapping," "precult/postcult identities," or "sudden personality changes." Personality is a fixed, permanent variable; only behavior changes. Indoctrination begins with recruitment, binding an individual to the group through ritual and secrets, creating a sense of specialness, and replicating family bonds (Lifton, 1961; Satir, 1964), while perverting social controls, that is, innate prosocial attitudes such as respect for authority and fear of negative consequences, among others (Bowlby, 1988; Shermer, 1997; Kent, 2005).

N: Negation. A process of devaluing the individual and their past through sustained criticism, often labelled as feedback or disengagement. All successful cults downplay the ego and consider it the ultimate enemy (Langone, 1986). Other forms of negation include triangulation (Satir, 1964), the silent treatment, lack of or inconsistent reinforcement, rejection, questioning of motives, etc. Any cult failures are the result of improper group dynamics (Festinger, 1956; Kent, 2001; Ofshe, 1992).

D: Deception. Lack of informed consent (Routh, 1994). Successful cults use deception in a wide variety of forms. Without deception, no one would affiliate or stay. Termed the true hallmark of a cult, deception prevents critical thinking and good decision making (Layton, 1998). Deception is not prevented by intelligence or rational thought, but is maintained by emotions, fear, and isolation. It is a temporary betrayal of self (Lifton, 1993) without awareness of the reasons driving it. Deception occurs in a pyramid fashion, where those above know more than those below, and leaders at the top restrict knowledge through the use of loyalty tests to climb higher in the pyramid. Deception is also detailed in the article by Langone, where he shows with great clarity the interplay of the three D's: deception, dependence, and dread. This is more accurate than the sensationalized term "phobia indoctrination," which does not capture the process of leaving a cult.

References

Bowlby, J. (1988). A secure base. New York: Basic Books.

Cialdini, R. (1984). The psychology of influence. New York: William Morrow.

Festinger, L., Riecken, H.W., & Schachter, S. (1956). When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world. Minneapolis: University of Minnesota Press.

Galanter, M. (1999). Cults: Faith, healing and coercion (2nd ed.). Oxford: Oxford University Press.

Kent, S.A. (2001). From slogans to mantras: Social protest and religious conversion in the late Vietnam War era. Syracuse, NY: Syracuse University Press.

Kent, S.A. (2005). Education and re-education in ideological organizations and their implications for children. Cultic Studies Review, 4(2).

Langone, M.D. (1986). Cultism and American culture. Cultic Studies Journal, 3, 157-172.

Langone, M.D. (Ed.) (1993). Recovery from cults: Help for victims of psychological and spiritual abuse. New York: W.W. Norton & Company.

Layton, D. (1998). Seductive poison: A Jonestown survivor's story of life and death in the Peoples Temple. New York: Anchor Books.

Lifton, R.J. (1961). Thought reform and the psychology of totalism. New York: W.W. Norton & Company.

Lifton, R.J. (1993). The protean self: Human resilience in an age of fragmentation. New York: Basic Books.

Ofshe, R. (1992). Coercive persuasion and attitude change. In E. & M. Borgatta (Eds.), Encyclopedia of Sociology. New York: Macmillan, 212-224.

Routh, D.K. (1994). Clinical psychology since 1917: Science, practice, and organization. New York: Plenum.

Satir, V. (1964). Conjoint family therapy: A guide to theory and technique. Palo Alto, CA: Science and Behavior Books.

Schein, E.H. (1961). Coercive persuasion: A socio-psychological analysis of the brainwashing of American civilian prisoners by the Chinese Communists. New York: W.W. Norton.

Shermer, M. (1997). Why people believe weird things. New York: W.H. Freeman.

Zablocki, B.D. (Ed.) (2001). Misunderstanding cults: Searching for objectivity in a controversial field. Toronto: University of Toronto Press.
