Saturday 28 July 2012
Robert Jay Lifton and the self-directed cult
Robert Jay Lifton has researched a remarkable diversity of cults, with an enduring focus on methods of mind control, from the techniques used in the universities of Communist China in the 1950s, to the methods of the Japanese cult Aum Shinrikyo (YouTube has immortalized a lecture of his on this subject). That, in itself, is interesting to me: my own work has dealt with both Buddhism and Communism, but I can't say that I have (yet) devoted much attention to the cult aspects of either one.
Lifton's approach has made the central question of cult analysis the paradox of how it is (procedurally) that sane adults lose their own powers of decision-making. I do not say this as a criticism, but merely to point out that the centrality of brainwashing reflects Lifton's professional interest in the witness to history as a patient in recovery thereafter. In contrast to an historian examining the same phenomena, Lifton is naturally more interested in changes that transpire when people escape from these cults, and regain their powers of decision-making, and reflect back on the prior events. In terms of the history of politics and/or/as religion, this puts an unusual emphasis on personal motivations, and subjective perceptions of institutional goals (as opposed to outcomes). Lifton is a psychiatrist, not an archivist, and, accordingly, he deals with a separate set of questions in looking at the history of China than, say, Frank Dikötter (in Mao's Great Famine).
That isn't a problem, but it is something interesting for me to reflect on, precisely because my own approach to history (having variously fallen under the headings of political science, philology, or otherwise) would naturally tend to marginalize the most blatantly cultic aspects. I also do not interview people in the status of patients, nor in a process of recovery; in almost any other setting, people who are critical of Communism in the present tense are likely to describe themselves (and the history they witnessed) as if they had been critics all along. Likewise, people disillusioned with Buddhist cults (or Buddhism in general) will tend to retrospectively insert critical attitudes into their narratives. In this way, the true-believer disappears from the recollection of history, although, in fact, the mentality of the true-believer is often something of decisive importance in understanding why the events happened in the first place.
For cultural outsiders (such as myself) discussing the history of the Khmer Rouge (formally writ Democratic Kampuchea) the last thing on anyone's mind would be the mentality of the true-believers (i.e., the attitudes or beliefs of the people who sincerely subscribed to the cult of Pol Pot, and who really believed that the regime would deliver on its promised goals). There is a natural tendency to either focus on the suffering of the great majority of the people, or else to give voice to the sorrows of the enemies of the regime who were persecuted. The minority (be it large or small) who really believed in the doctrine of the cult are marginalized by every approach to the history except for Lifton's school of cult-analysis. In many contexts, it would seem startling and perverse if someone were to approach the same chapter of history in terms of the suffering of the true-believers --and in terms of their psychological recovery after leaving the cult (i.e., their sorrows in losing their faith in Pol Pot).
The study of cult methods does have some significant applications to Buddhism in the 21st century, but there are also limitations that result from the (natural and inevitable) preoccupation of the discipline with apocalyptic cults in particular, and with cults that exercise a great deal of hierarchical control over their members (both in terms of suasion and coercion) rather than those that allow followers a high degree of self-direction.
I do not think that the concept of a self-directed cult is contradictory; it seems to be one of the principal refinements of the religious mentality, i.e., that the devotee takes responsibility for their own devotions. Perhaps this aspect is clearer in various forms of Buddhism that do not shift responsibility onto a monotheistic deity (nor a magical agent of fate, etc.) and more clearly burden the practitioner with doing the work of their own salvation. What is perhaps most sad (and most significant) about the cult mentality is that it can arise in the absence of coercion: people can (and do) lose the ability to make their own decisions without someone else taking it from them.
One of the more obvious elements of Lifton's influence is the direct imitation of his numbered lists (iterating "criteria" for the cult mentality, often linked to concepts that are important for treatment and recovery thereafter). I have now seen several authors propose their own lists of the distinctive aspects of cult mentality (often as observed amongst the members of a specific cult) in contrast to (or expanding upon) Lifton's list in his 1961 tome, Thought Reform and the Psychology of Totalism. One of the things that is (prima facie) remarkable about this influence of Lifton's earliest work is that the psychology of Maoism (perhaps a uniquely 20th century "religion"?) has become a template for the analysis of fundamentally different cults (some claiming ancient origins, others based on new revelations, and so on). I'm not complaining: I genuinely do see common elements to Communist and Religious authoritarianism in the 20th and 21st centuries --and their common ground may derive from the scientism of the era (and not from commonalities in the specific soteriological content of the competing doctrines). However, while Lifton's analysis may have broad applicability (and his definitions of terms, categories, etc., have provided the vocabulary for a sort of miniature social science specialized in this subject) there is an inevitable incommensurability of the things being analyzed. The loss of the individual's capacity to make decisions may be an important phenomenon in many different religious settings; however, it may also be the only thing that various cults have in common.
My ulterior motive in writing this article, I note, is that I may inspire a few of my colleagues to send me recommended reading on this subject (as I have seen very little analysis of 21st century Buddhism along these lines). As I was reading more and more of the analysis of the cult mentality, I started to sketch out what I thought were the aspects that were absent in the approaches I've seen so far (but that are significant to the world as I've known it). Obviously, this is not a rebuttal of any of the works I've been reading (and, thus, I don't cite them) but just reflects the real differences in the religions that the respective authors have been studying --and, perhaps, differences in the historical periods concerned, even if they are only separated by a few decades.
(1) One of the broadest assumptions I've noticed is that cults operate through active recruitment and conversion rather than the passive attraction of people who already agree with the cult's doctrine. The growth of communications technology has massively expanded the scope of the latter mode of operation: any group (political or religious) can now create a crossroads for people who are already like-minded or who already agree with some portion of whatever the doctrine may be. In the 1960s, such a passive approach would have hardly connected strangers within a single city or university campus; today, it can connect people across continents. The increasing ease of finding people who are already sympathetic to a cult's cause may well result in a decreasing effort to make any kind of convincing (or even coherent) account of what the doctrine is supposed to be for an audience of unsympathetic or skeptical outsiders.
As simple as it may sound, there is a world of difference between a group that presumes its members will sign up (by e-mail or telephone) to attend a conference in advance, and a group that has to actively invite strangers and skeptics to attend its rituals, lectures, and so on. The latter was a ubiquitous part of American campus life in the 1960s (everyone from the Communist Party to Satanists were actively seeking converts) but this is obviously an exhausting and frustrating experience for the true-believers themselves, and it often resulted in an asymmetry between the "public doctrine" used by the cult to recruit outsiders, and the "private reality" of what was believed and practiced amongst longstanding members. As with many other forms of marketing, we may now see the decline of the "cold call", simply because it has become easier to create a crossroads for people to meet at who have at least some interest in (or some prior agreement with) whatever the cult is offering.
The function of "conversion" may not involve convincing anyone of things they don't already believe, but merely of organizing people who are already disgruntled with other religious/political movements, and who are attracted to the cult on the basis of whatever common ground they perceive themselves to have with it (prior to contact).
(2) I find that the cult mentality is signaled by the refusal to admit of any distinction between expert and amateur opinion, and this may be of increasing significance in the 21st century context, with the increasing ease of consulting reference books, looking up scientific facts of various kinds, and so on. There was a time, simply, when it was much easier for the believers in a miracle cure to ignore the facts of medicine; those facts have become embarrassingly easy to check; ingredients with complex Latin names have become much easier to decode.
Real devotees of Communism regard economics as the "magical science" of dialectical materialism; and yet, for that same reason, they cannot cope very well with the unmagical facts of economics (that occupy so much newsprint from day to day). Either the experts must be demonized (as believers in crackpot therapies will vilify the entire medical establishment as a conspiracy against their own cult, etc.) or else the distinction between expert and amateur must disappear. The latter is a slippery slope, and it goes a long way down: it is fairly easy for someone to convince themselves that they know more about economics than the opinion pieces in the local newspaper, but, soon enough, they have come to imagine themselves as knowing more than all the most respected authors in the history of the subject --and, of course, the cult rewards and reinforces this mentality.
Francis Bacon would remind us that we should measure knowledge in terms of "operation" (our ability to do things for ourselves, and to discover things for ourselves), rather than in terms of quoted dogma. Although the convert to Communism may be convinced that they understand the economy much more profoundly than the editors of the Financial Times, they cannot, in fact, do the math themselves. Within the cult mentality, the fact that the true believer cannot read a simple chart (nor interpret the statistics and facts offered by the detractors of the cult) is easily dismissed; outside of the cult mentality, it is impossible to ignore.
I think there is an important distinction to be made between the Communist who believes merely that Communism has its own science of economics somewhere (i.e., that someone in an office in Moscow knows better than the Financial Times, or that the ghost of Lenin knows things that mere mortals cannot fully know) and the Communist who has come to regard himself (or herself) as being at no disadvantage to the experts. What I encounter, in our era, is that the believer convinces himself that he understands the only true science of economics, and refuses to recognize himself as a mere amateur (in contrast to experts of any kind, be they sympathetic or antithetical to the cult). The Christian Creationists memorize factoids about geology and physics that (supposedly) favor their case, and the Communists memorize factoids about fractional reserve banking; in both cases, the effect on the believer is not that they regard the researchers who discovered these things as extremely intelligent, but that they feel very intelligent themselves. The conceit is not that someone like Lenin knows better than the experts, but that the followers individually know better than the experts (and, thus, debunking Lenin has no effect on their faith in their own expertise). Although I won't digress into the application to Buddhism: it isn't difficult to find people in Taiwan who are convinced that a very casual acquaintance with one short text by Nāgārjuna makes them equal to any or all specialists who draw their knowledge from the prolix literature extant in Chinese, Tibetan, Sanskrit and Pali. I think this is a crucial aspect of the self-directed cult: the reliance on authority of any kind diminishes, as the follower becomes more and more convinced of their own infallibility, in their unmediated relationship with the invisible forces that they believe define the world around them.
This attitude seems to have been facilitated by the transition to the internet, as it allows people to censor their own reading more than ever before, and to selectively engage only with sources that will reinforce their own delusions. By contrast, if you take a series of economics textbooks off of a library shelf, and glance through them to select the one you prefer, the vast majority of what you're looking at may challenge your preconceptions, and may challenge your sense of your own expertise. The latter experience is, quite rightly, humbling, and it is not humbling for the science of economics only.
(3) Another signal of cult mentality that is not new but is perhaps exacerbated by the same changes in the technology of learning is the refusal to admit distinctions of scale in social phenomena. Modes of reasoning that create categories of "everyone" and "no-one" have always existed, but the erasure of the distinction between a few hundred people and a few hundred million is made easier through the same cycle of self-selected information alluded to above. It seems to me noteworthy that Western Communism in the 1960s and 1970s offered complex justifications as to why a cult-like organization of just a few hundred members could have the significance of millions, but this was still predicated on an awareness of the fact that they were a tiny minority, and that they needed to offer some justification for this fact, and for their self-perceived historical significance. This is a different thing from the delusion that the hundreds really are millions, or the delusion that a movement existing only in a few odd places actually occupies the entire world. Although some people will tell me that I'm reporting nothing new here, please keep in mind that cult members can now see short film-clips sent in by followers all over the world (via YouTube, etc.). This is a mode of auto-propaganda that did not exist a few decades ago, and a few dozen followers scattered around various continents can give the impression of a mass movement where there is really none at all.
I don't have much to say about this, except that it is a delusion that I meet "face to face" often enough. When I have spoken to true-believers in cults of various kinds, I have been able to address this type of belief through the most simple sort of Socratic method: I would simply ask about exceptions to their dogmatic assertions, and allow them to answer in their own terms. For Buddhists, in general, it is extremely painful for them to admit (or, perhaps, to realize) that the very sect they would dogmatically assert to be helping the poorest of the poor everywhere is, in fact, only seeking out a small number of the wealthiest overseas Chinese communities (in California, etc.). I recall a long (and not at all confrontational) dialogue with a true-believer of this kind, who had this realization after I asked him some very simple questions, such as "What about Haiti?" He had apparently never questioned his organization's own claims about their geographic extent (and priorities) before, and he had never reflected on the fact that they had never even made any attempt to help (or speak to) the black population within the cities of the United States they were headquartered in.
(4) Following the same pattern is the refusal to admit the sources of knowledge, which is, likewise, an aspect of cult mentality exacerbated by the fact that sources can now be checked so rapidly and easily. This, also, is very different from former generations who would openly admit that they only knew something because their grandparents explained it to them in that way, and they might be satisfied to say that they can provide no account of their grandparents' reasons for having such confidence. Perhaps the collapse of traditional authority (or the belief in tradition simply because it is tradition) is more fundamental than technological change in this case, but the ease of checking facts and comparing alternative explanations is certainly a significant catalyst.
I should clarify that by "sources of knowledge" I do not mean published works only: the refusal to admit the source of something known includes (e.g.) the refusal to admit that something was revealed in a hallucinatory experience (even if this was part of the original doctrine of the cult). It may be a refusal to admit that one doctrine was set down as a substitute for another that failed to get results, or any other disavowal of the process of discovery (magical or otherwise) that produced the thing now known.
Of course, Lifton's enduring idea of the thought-terminating cliché comes to mind in this context: routinely, some catchphrase is offered to settle the matter, so that the question of how we know what we are all supposed to know does not emerge.
However, I do think there is a generational difference in this pattern: part of the refusal to admit or examine the sources of doctrines is an embarrassment that surrounds their revelation. The problem is one of believing in a doctrine, but not believing (sufficiently) in the conditions that produced the doctrine (be it secular or sacred).
I have met people who live on a sort of religious tightrope, because they believe in the doctrines that were revealed through hallucinations, and yet they do not believe that hallucination is a reliable source of doctrines (and I think they do not trust their own hallucinatory or meditative experiences, while still wanting to believe in the given doctrine). This tension is no less true for the Communists, who may want to believe in the doctrines of dialectical materialism (as isolated tenets) but who are deeply embarrassed by the origins of those doctrines (that are neither scientifically valid, nor marked by any religious miracle to set them above the scrutiny of science), and even more embarrassed by the committees that redefined them as time went on (Karl Popper caused a great deal of this embarrassment in his books The Open Society and Its Enemies and The Poverty of Historicism, precisely by drawing attention to the "sources of knowledge" that Communism relied upon).
The generational change, I think, is precisely the reluctance to propound hallucination simply as hallucination, and, as I have already said, the reluctance to rely on tradition simply as tradition. There is no longer a pervasive (cultural) sense that hallucinations couldn't possibly be deceptive, nor that one's ancestors would be inerrant in setting down principles for the next generation. Historically, the belief in things seen and heard is very much of the opposite tendency: if you're willing to believe in what people have seen, then you're willing to believe in ghosts (the refusal to believe in ghosts requires criteria that are quite a bit more complicated than validating what is merely seen). The assumption that elaborate hallucinations are a source of real knowledge is not incompatible with bare empiricism, because people really do see the things that they hallucinate; by contrast, an awareness of epilepsy as a medical condition (that has produced hallucinations in every era of history) profoundly erodes anyone's willingness to rely on such testimony, no matter how honest it may be.
(5) Bundled up with all of the above, I think there's a coeval refusal to admit the distinction between a principle and its outcome. Although some might say there's nothing new about this minor paradox, I do think this has been changing with the passage of generations: a crypto-religious group like Alcoholics Anonymous now must assert that their principles are good unto themselves (as if the principle itself were the product) precisely because there are no measurable outcomes that would validate their practice (and, on the contrary, statistics show the method to be a failure). Likewise, of course, Communists now attempt to dodge the question of the outcomes of their revolutions by arguing that (somehow) revolution itself is good in principle (and so on for a long list of other Communist principles that had terrible outcomes in actual fact, but that they can still seek to glorify in the abstract, as if the goodness of the principle itself were the only valid question).
I had a long discussion (in which I did very little talking) with a Sinhalese man who wanted to convince me that his practice of a certain form of Buddhist meditation had resulted in mosquitoes refusing to bite him. He explained the principle of the thing at length, to which I responded simply, and with a smile, "Yes, but mosquitoes still bite you." He then presented a synopsis of all the same principles, insisting on how pious and good the theory of the thing is, and that people should refrain from killing mosquitoes, with the confidence that the mosquitoes will reciprocate these sentiments. I replied by saying in almost the same words again, "Yes, but if we go into the forest now, mosquitoes will still bite you." I think the friendliness of my manner melted his resolve somewhat: he offered me a third defense of the thesis, with rather less emphasis on his former claim that the efficacy of this meditation was verified by the refusal of mosquitoes to bite him, memorably claiming "…and the mosquitoes never, ever bite me, except only sometimes…". Alas, I'll never know how different my career might have been if I had spent more time in Sri Lanka than in Laos.
Although this is a lighthearted example, the same problem extends to matters of life and death, most obviously when medical care is refused on account of such a principle.
If the question should arise as to whether or not a particular practice of meditation produces the supposed (medical or psychological) outcomes that are promised, the cult mentality is signaled by a refusal to examine what the outcomes are, to instead return to insisting on the goodness of the principle being taught. The principle itself is the product: the outcome cannot be thought of as a separate category in the mind's eye of the believer.
The generational change I would infer here is, simply, the greater accessibility of social science methods (that are, after all, a commonplace source of newspaper column fodder, even if it is in an extremely simplified form). Although not all claims can be verified so simply as sleeping in the woods without a mosquito net, there is now a broad understanding that medical claims can be measured against the outcomes for people who undergo the treatment, and even psychological "benefits" (such as those claimed by Alcoholics Anonymous) can be compared to measured outcomes of some kind. There is a sort of preclusion of skepticism in just asserting that the principle of the thing is valid unto itself, and admitting of no distinction between it and its outcome.
All five of the concepts I've sketched out have applicability to 21st century Buddhism, and are worth inquiring about in religious orders that have not reached the (headline-grabbing) extremes of Aum Shinrikyo. Although each of the five may seem rather mild if examined in isolation, they are rolled together in the cult mentality. In combination, they're an intoxicating full-time program of self-delusion.
As with the person who thinks of themselves as an expert in economics who can't interpret a simple graph, there are remarkably obvious limitations to the ability of these laypeople who now both lead and follow religious movements. I meet devotees (at all levels of education, including PhDs) who refuse to admit that anyone knows more than they do (even if they cannot read or understand the language of the sacred texts their practice is based upon, etc. etc.) while at the same time refusing to admit what their sources of sacred wisdom really are (what is text vs. revelation, and even what texts are Chinese vs. Indian in origin, etc. etc.) and also refusing to admit that their practices may be evaluated relative to their real outcomes, and so on.
Although I have seen writing about the (so-called) "secularization" of instruction in meditation, these articles have generally been interested in the alienation of the usual role of the monks, and the diminishing status of ritual (relative to meditation, lecturing, etc.). The mentality of the people who actually believe they are equal (or superior) to senior monks is worth examining in itself, along with their attitude toward scholarship as such (regardless of the monastic or secular status of the scholar). There is, I think, an important need to apply Lifton's approach to the mentality of these meditation leaders, even though the groups they are members of may never do anything violent or infamous enough to make newspaper headlines.
In my experience of speaking to such people face-to-face, their attitude is not merely one of disrespect to senior monks or scholars who may have more objective knowledge and experience than themselves --instead, it reflects a very real delusion (so to speak) that the experts do not know anything that they do not know themselves. Even in the total absence of coercion, this all adds up to a sort of self-directed cult mentality. I cannot honestly say to what extent this marks a change over the attitudes of Buddhists just 200 years ago (i.e., 1812) but it seems difficult to believe that this level of chutzpah was so widespread in societies that still valued the contribution of the scribe in reproducing texts by hand, and of the monk in memorizing and reciting texts aloud, even if intellectuals interested in debating the content of the texts were as tiny a minority then as now.
Socratically enough, I have asked lay meditation leaders questions such as, "Is there really no monk alive that you respect enough to invite to speak at your meetings? Have you never met a monk who you thought was more advanced in meditation than yourself? Is there really no specialist whom you would want to consult before offering these philosophical lectures of your own invention?" The answer is that there can be no answer: any answer would admit that there's a hierarchy of knowledge, and that we only know things in reliance upon particular sources, etc. --all of which is very much antithetical to the disaggregated and self-directed ego-trip of 21st century Buddhism. I'm not insinuating that previous centuries did not have ego-trips of their own, but only that there are some features that seem to me distinctive to our own era.
I have heard (and read) devotees openly disrespecting (and insulting) the very same western scholars whose translations they rely upon as the source of their knowledge of doctrine. Although there is a bit more self-censorship when the translator is a monk, the dismissive comments are not all that much different. There is a tremendous sense of entitlement among the current generation of Buddhist meditation instructors (and even meditation followers) to dismiss anyone and everyone as not knowing what they're talking about --comparable at least in style (and certainty) to the ease with which so many people in my generation dismiss the entire corpus of knowledge published about the history of economics (in claiming that fiat currency and fractional reserve banking are some kind of conspiracy that nobody is brilliant enough to understand aside from those who denounce it as a conspiracy).
The delusion of one's own infallibility is made much more powerful by my first point in the numbered list: there is less and less of an incentive to provide a coherent (or convincing) account to outsiders, because religious groups (at any scale) can passively attract like-minded people who already agree with them. I do not think I have ever met anyone alive (monk or layperson) who had ever attempted to convert a total outsider to Buddhism; at most, they had tried to convince people who were partly in agreement with their particular form of Buddhism to become fully in agreement. Whereas the Buddhism of just 100 years ago (in both East and West) was reasonably obsessed with providing an eminently rational, defensible account of the religion to outsiders (including the administrators of the British Empire in Asia) the 21st century tendency to operate the religion as a crossroads for the meeting of like-minded individuals is at the opposite extreme.
So the practitioner becomes an unquestioned authority in his or her own eyes. The reader may ask, is this really so bad? Well, one of the features of cult mentality as defined by Lifton is that it can be cured: in fact, compared to an addiction, recovery is relatively rapid, easy, and irreversible, even in cases of state-sponsored brainwashing. What I find very sad about the type of self-directed cult that I have seen myself (and that I am here, with a few sketches, describing) is that it is very difficult for anyone to become disillusioned with it: once people are convinced that they have nothing to learn from anyone else, it is very difficult for them to ever recover. The loss of curiosity is just as pernicious as the loss of liberty; no matter how gratifying the confidence of faith may seem, losing the capacity to thus ask questions, and to regard the world as full of things as-yet-unknown, is a first step on the road to losing the power to make one's own decisions --and it should be watched as carefully as the last step.