AIDS: The Burdens of History
During the past two decades Americans have participated in a series of debates about the appropriate social response to disease. At first glance the issues seem to differ widely. What were the appropriate responses to hyperactivity in children? premenstrual syndrome in women? homosexuality? drug and alcohol addiction? Were any of these in fact diseases or simply labels for socially defined deviance? What should or could have been done about John Hinckley and other possibly insane offenders? Are diagnosis-related groups an appropriate mechanism for rationalizing the costs of inpatient health care? Does sickness come in neat and categorically distinct units? What are appropriate governmental and individual responses to AIDS? One could continue to add examples, but the point seems obvious. Despite their diversity, these controversies are bound together by several themes. One is the way that relationships between the medical profession and society are structured around interactions legitimated by the presumed existence of disease. A second theme is the negotiated aspect of disease as social phenomenon. A generation of social scientists and social critics has emphasized that there is no simple and necessary relationship between disease in its biological and social dimensions. Some ills have a well-understood physical basis, others none that can be demonstrated. Meaning is not necessary, but negotiated, the argument follows; disease is constructed, not discovered.
Critics have turned the delegitimating tools of cultural relativism on medicine as they have on so many other areas in which knowledge and
power are closely linked. For such scholars, Michel Foucault, not Robert Merton, has become the sociologist of choice. "I assert," a recent student of cholera and of Foucault argues, "that 'disease' does not exist. It is therefore illusory to think that one can 'develop beliefs' about it or 'respond' to it. What does exist is not disease but practices." Medical knowledge is not value-free to such skeptics, but is at least in part a socially constructed and determined belief system, a reflection of arbitrary social arrangements, social need, and the distribution of power.
The medical profession's institutional power has long been an object of reformist concern, but during the 1960s and early 1970s medicine's conceptual foundations came under increasing attack. This relativist point of view has sought to undermine not only the apparent objectivity of particular disease entities but also, by implication, the legitimacy of the social authority wielded by the medical profession, which has traditionally articulated and administered diagnostic categories. The physician is not above social interest, but is a social actor whose mission of defining and treating disease can express and legitimate professional, class, or gender interests. This is obviously as much a political as an epistemological position. The marriage of cultural criticism and antipositivism became an influential, if never a majority, view during the past generation.
These relativist arguments are familiar and have become, in fact, a cliché among social historians and social scientists. Yet it is a point of view that seems increasingly sectarian. The weight of scholarly opinion has in the past decade shifted toward an emphasis on biological factors in the understanding of disease and human behavior. We have seen this in a growing interest in the roles of heredity and constitutional factors in disease and behavior, a growing somaticism among students of mental illness. The perceived failure of deinstitutionalization has, for example, underlined the intractability and presumed biological underpinning of the psychoses. Such views are, at least in emphasis, a rejection of once-fashionable sociological formulations that tended to dismiss the diagnosis of mental illness as an exercise in the labeling of deviance.
But no single event has had a more dramatic and illuminating impact than AIDS. It has proved an occasion for labeling, but it is not simply an exercise in labeling. Gay leaders who had for decades urged the demedicalization of homosexuality now find their community anxiously attuned to the findings of virologists and immunologists. This is not to
say that the social perception of AIDS and the definition of policy choices are not shaped by preexisting social attitudes; the deviant are still stigmatized, victims still blamed. But the biomedical aspects of AIDS can hardly be ignored; it is difficult to ignore a disease with a fatality rate approaching 100 percent. AIDS has, in fact, helped create a new consensus in regard to disease, one that finds a place for both biological and social factors and emphasizes their interaction. Students of the relationships between medicine and society live in a necessarily postrelativist decade.
But as we accept our dependence on the laboratory and its findings, a number of thoughtful Americans still find it difficult to remain optimistic about society's ability to harness that knowledge; increased understanding of the natural world does not bring automatic and unalloyed benefits. We have been made too conscious of the complex and problematic relationship between medical knowledge and its application. Our decade may be increasingly postrelativist, but we are still products of a generation of relativism, conscious of the costs as well as the benefits of scientific medicine, of the provisional yet indispensable quality of medical knowledge. The meaning of disease has in the recent past become more rather than less ambiguous. It is therefore hard to embrace the clarifying simplicity of either extreme: the reductionist view that concerns itself with verifiable pathological process alone, or the uncompromising relativist position that chooses to ignore that same pathological process in shaping specific social responses.
This postrelativist ambivalence about medical knowledge is an uncertain position, one that would have made little sense to men of goodwill who sought to understand the social role of medicine in the 1930s and 1940s. This generation thought very differently about disease and the doctor's role. They shared an optimistic faith in science and medicine; superstition and social injustice had impeded, and would continue to impede, the accumulation and distribution of knowledge, but the ultimate trend was toward a more humane, healthy, and enlightened society.
No one was more prominent in that generation than the historian Henry Sigerist, a prolific author, defender of Soviet medicine, and a self-consciously irreverent gadfly of the American medical establishment. "Disease as we conceive it today," he wrote in 1943, "is a biological process. . . . Disease is no more than the sum total of abnormal reactions of the organism or its parts to abnormal stimuli." It constituted a failure of the organism to adapt to its environment; disease could, that is, be socially induced, but it was not simply a social construct. It was a real pathological phenomenon. In fact, this very lack of ambiguity underlay the role of disease as a tool of social criticism; the etiology of pellagra (a disease resulting from dietary deficiency) tells us something specific about mill villages and welfare institutions. The etiology of lice-borne typhus tells the epidemiologist something very precise about cleanliness and even the price of clothing in communities with a high incidence of the disease. The persistence of typhoid in the early twentieth century constitutes a telling critique of those communities that tolerated a contaminated water supply. Medical knowledge could serve as both tool and rationale for social intervention.
Sigerist, like almost all of his contemporaries of whatever political persuasion, always maintained an enormous faith in the ultimately positive role of science in human affairs. "The more I study history," he concluded during the darkest days of World War II, "the more faith I have in the future of mankind, and the less doubt as to the ultimate result of the present conflict. The step will be taken from the competitive to the cooperative society, democratically ruled on scientific principles." Science and scientific medicine were necessary aspects of the solution, not part of the problem. Such assumptions were widespread. Pioneer students of the social history of medicine, for example, tended to see as fundamental the ways in which society could stimulate, or, too frequently, impede, the autonomous and ultimately liberating development of science and medicine.
Certainly scientific ideas could be misused. The Nazis' use of a racist eugenics is an obvious example; but, as Sigerist put it, eugenics was a "socio-biological experiment that deserves to be watched carefully, even if the present Nazi regime has made it subservient to a thoroughly reactionary and unscientific politico-racial ideology." The German advocates of a racist biology were, in other words, false priests of a true religion. It seemed inconceivable to him that science would, in the long run, not stand with the forces of enlightenment and egalitarianism.
To reformers of Sigerist's generation, disease incidence was often the result of particular social arrangements, especially economic inequalities. Disease could also become part of a vicious cycle, miring families and individuals in poverty. Such ideas were widespread among advocates of what contemporaries called "social medicine," a point of view that recognized the limits of therapeutics and emphasized instead the
ways in which disease reflected environmental conditions. The preservation of health therefore often required the modification of social and economic relationships. Therapeutic intervention, according to this view, was not the answer. "Medical care cannot alone eradicate pellagra and rickets," as two leading authorities explained, citing particularly telling instances: "These conditions are for the most part diseases of poverty and ignorance," they stated, "and their prevention and cure lie with the economic and social system." "Health," they continued, "can be achieved only as a part of a high standard of living, in which good medical care is only one of a number of essential elements." It must be emphasized that their study was not a call for radical social change but a plea for the more effective and equitable distribution of medical care. The point is, of course, that the authors could not envisage a conflict between these goals. In the 1930s the fundamental problems in health care were not perceived as intrinsic to scientific medicine; instead they lay in maldistribution of the real benefits that medicine could provide. The establishment of hospitals and the provision of well-trained physicians for the poor and isolated were moral and practical necessities. And such convictions were shaped before the availability of antibiotics and the array of therapeutic and diagnostic tools that have transformed medical care in the past half-century.
Perceptions of medicine are rather different today. Despite two generations of enormous technical change, we have become aware that medical progress implies other than monetary costs. We have allowed an increasing number of men and women to live longer, yet often more incapacitated, lives. We have seen an expanded and generally more accessible medical system accused of insensitivity and physicians charged with greed and inhumanity. We have seen Sigerist's future, and in some ways it seems not to have worked. Few would-be reformers of medicine in the 1980s have been able to share his generation's confident belief in the ultimate and unambiguous benevolence of scientific medicine, no matter how impressive its technical achievements.
Yet as a social institution and body of ideas medicine has never been more central to American society. In the past half-century we have devoted an increasing proportion of our resources to medical care. Public expectations have increased proportionately, along with a widespread resentment at medicine's inability to comply with these imperial expectations. Malpractice suits are only one indirect index of the pervasiveness of such hopes.
Definitions of disease have come to play a particularly prominent role at the margins of medical competence where the authority of medical practitioners and medical ideas is most obviously subject to negotiation. We tend not to question the appropriateness of an orthopedic surgeon's role in treating a broken kneecap, although we might question his or her exclusive role (until recently) in legitimating and controlling third-party payment for that treatment. A good many more of us would question the physician's role in defining behavioral deviance. Others would question the appropriateness of contemporary medical priorities in setting health care policy in regard to the very young, the chronically ill, and the very old. We are happy to have immunologists study AIDS; we disagree about the policy implications of their findings. Americans have, in fact, asked, or have been willing to allow, physicians to play a variety of gatekeeping as well as therapeutic roles. They have been rewarded with both power and resentment. Perhaps it was inevitable that disease-definition would become a key battleground in the debate surrounding the prerogatives of physicians and the responsibilities of government.
Ideas about the nature of disease have been fundamental both to the internal evolution of medicine and to the profession's complex interactions with society. But even if that centrality has remained consistent over time, the specific nature of those concepts and interactions has changed; Sigerist's confident view of disease as a discrete pathological process had already substantially evolved from traditional concepts.
Perhaps the most significant difference between his ideas and those of his late eighteenth- and early nineteenth-century predecessors lay in the areas of boundaries and specificity. In 1800 sickness was still viewed in largely individual terms; true, there were well-marked ills that experience had come to define as relatively specific: smallpox, for example. But even in such ailments, idiosyncrasy and predisposition could shape an individual's response. Most sickness was not understood in specific terms, even if its ultimate manifestations fell into accustomed patterns. Even epidemic disease was understood to result from an unbalanced state in a particular individual, an imbalance resulting from the sum of interactions between an individual's constitutional endowment and the environment; thus the conventional and persistent emphases on regimen and diet in the cause and cure of sickness. It was natural for physicians to assume connections between physical and psychological environment and sickness; they imposed no rigid boundaries between body and mind or between individual and environment.
It is tempting to see such systems from a functionalist point of view, to underline the ways in which this flexible explanatory system could serve both as behavioral sanction and as a basis for legitimating the physician's social role. Physicians could provide explanations for the inexplicable, reassure those still well that reason guaranteed their continued health, and at the same time reinforce society's moral assumptions. Individuals could and often did play a role in the development of their own ailments; volition and, thus, social norms explained why the drunkard, the financial speculator, and the glutton succumbed. But volition could also be used to explain the role of crowding, poor diet, and economic exploitation. The sick man was both actor and acted upon. Like an assortment of bricks, the elements of this speculative pathology could be put together in different forms according to the builder's requirements. Freethinkers could thus see enthusiastic religion as a cause of sickness, while the more evangelical could indict irreligion. The prominent role accorded to volition implied the possibility of control.
Disease ultimately expressed itself through physiological and anatomical mechanisms; but these pathological mechanisms were activated by a unique configuration of interactions between the individual and his or her environment. Significantly, however, the form of such explanations was always material and rationalistic, no matter how strained and speculative, no matter how transparently they incorporated social norms and attitudes.
Even epidemic disease could be made to fit into the same rationalistic framework, despite the obvious fact that some general factor had to be at work. The case of cholera is particularly enlightening. The most frightening and novel of nineteenth-century European and American epidemics, cholera is the closest modern analogy to AIDS. Asiatic cholera was unknown in Western Europe before 1831; it killed roughly half of those it attacked, and did so, moreover, in particularly rapid and dramatic fashion. No other pandemic had so focused popular and professional fears since plague had receded from Europe in the late seventeenth and early eighteenth centuries.
Lacking an understanding of the etiological agent, contemporaries framed a picture of cholera that sought to reduce the threat of randomness while it articulated social values and status relationships. The dirty, the gluttonous, and the poorly nourished alike were predisposed to the
disease. Predisposition was, in fact, a key term in attempts to explain this and other epidemic ailments, for it served to explain the selective exactions of what was at some level a general stimulus. Physicians played a necessary role, providing what reassurance they could in a rational, if, in retrospect, speculative, form. With no consensus regarding the pathology of the disease or the understanding of its etiology, social variables necessarily played a prominent role in fashioning a usable framework that enabled regularly trained physicians and their middle-class patients to cope with the disease. All this was soon to change. By the end of the nineteenth century, disease had become a more specific, yet at the same time more expansive, concept.
During the first thirty or so years of the nineteenth century, elite physicians began to assimilate the idea associated with the so-called Paris clinical school that disease was a specific, ordinarily lesion-based entity that reenacted itself in every individual sufferer. Lesions discernible at postmortem could be correlated with symptoms exhibited during the patient's life. Disease could also be (and often was) construed as a disturbance of physiological function that induced an anatomical lesion over time. The study of physiology could, some of the discipline's nineteenth-century pioneers claimed, be a study of disease causation. But whether one emphasized anatomical change or physiological function, symptoms were the consequence of specific material mechanisms. Idiosyncrasy was by no means banished; the predisposition to sickness, the clinical expression of a particular ailment, and the response to therapeutics were still seen in terms of an individual's constitution and personal habits. Physicians and laypersons alike instinctively preserved a role for choice and individual responsibility in explaining the selective exactions of disease.
The cause of these newly distinct entities remained a mystery, however. Some medical thinkers even contended that the ultimate cause of disease would always remain beyond human understanding; speculation could lead only to self-delusion. Degenerative or constitutional ailments might be assumed to be implicit in the design of the human body and the aging process. Acute infectious ailments, however, could not be so easily explained.
As we are all aware, an explanation was soon forthcoming. The germ theory, first plausibly articulated in the 1870s, promised to illuminate both the transmission of infectious ills and the particularity of pathological mechanisms. Thus, the evolving model of disease should be seen as having taken two linked steps in the nineteenth century: The first emphasized the specificity and the somatic, mechanistic aspect of disease; the second provided a discrete cause for those changes. The legitimacy of the new style of conceptualizing disease entities was related closely to both the specificity and the tightness, or unity, of individual entities. Change was gradual, especially among laypersons. Well into the twentieth century, for example, the common cold was widely regarded and feared as the first stage of an illness culminating in tuberculosis.
The history of nineteenth-century pathology and clinical medicine seems only to underline the explanatory value of this new way of seeing diseases. Syphilis and tuberculosis, for example, so protean as clinical phenomena, gradually came to be seen as having fundamental unities based on cause and consequent pathology. Truth lay in discerning a more real (more universal and fundamental) causal reality beneath the elusive and ever-changing surface of their appearance in particular individuals. The intellectual tools for constructing an understanding of that underlying truth came increasingly from the insights and techniques of the laboratory. A minority of early twentieth-century physicians did protest the tendency toward mechanistic reductionism in diagnosis and treatment. Their successors have continued. But such warnings could not compete with the laboratory's allure; they still cannot.
Even before medicine possessed resources for treating these newly elucidated clinical phenomena, the gradual acceptance of the notion of specific disease entities by laypersons and practitioners helped reshape the physician's role, underlining the importance of the technical and increasing the gap between lay and professional medical knowledge. Early twentieth-century reforms in medical education and the standardization of hospitals were both, to an extent, responses to this emerging consensus. Sickness was now a discrete, material phenomenon, best understood by the tools of science and best treated by individuals who had mastered those tools.
But if medical knowledge was gradually becoming segregated in credentialed hands, laypersons were compensated with greater expectations and an increasing faith in medical ideas and medical experts. It was a kind of implicit contract: Society received a measure of emotional reassurance and clinical efficacy in exchange for the increased status and autonomy of medicine. Beginning in the 1880s the laboratory provided a series of dramatic insights. The discovery of the causes of cholera, tuberculosis, typhoid, and diphtheria were not esoteric events isolated in the pages of technical journals, but front-page news. And to laypersons and physicians alike, much of medicine's new explanatory power was
construed in terms of specific ills and the ability to understand, diagnose, prevent, and, in a minority of cases, treat conditions previously intractable and mysterious. Even if the demographic impact of rabies immunization and diphtheria antitoxin was minor, these treatments provided striking public evidence of medicine's new powers.
The problem, of course, with this vision of disease is not that it was wrong (although in retrospect it appears incomplete and prematurely reductionist) but that it was, in fact, so powerful and seductive. No group in society was more impressed than the medical profession itself. Professional status and prestige were soon recast in these new forms. Scholarship had always been important in elite medical circles. But now that scholarship had increasingly to be expressed in the form of laboratory research or systematic clinical investigation; the library and bedside no longer defined the boundaries of professional excellence. This shift in values was also effective in helping to recast the institutional shape of the medical profession, legitimating and providing content for a proliferating specialism and an increasingly self-conscious hospital and academic elite. It is true that an appropriate role remained to be defined for the so-called basic sciences in clinic and medical school; but this is irrelevant. As we are well aware, an acute-care, specific-disease-oriented approach came to characterize both the twentieth-century hospital and the career priorities of doctors. Insofar as the laboratory and basic-science disciplines were incorporated into the hospital and academic medicine, they were most frequently bent to the purpose of elucidating and monitoring pathological mechanisms.
In the last third of the nineteenth century a related, yet potentially inconsistent, development was taking place in that contested cultural terrain where society's tendency to prescribe and proscribe behavior intersected with the prerogatives of medicine. Disease boundaries were expanded to include behavior patterns that might have been dismissed as perverse or criminal in earlier generations. Most conspicuous was the way in which deviance was increasingly, if by no means universally, being defined as the consequence of a disease process and, thus, appropriately the physicians' responsibility. Toward the close of the nineteenth century, for example, neurologists widened the categories of ailments they chose to treat: Phobias, anxieties, and depression could now be classed as symptoms of neurasthenia, and alcoholism, drug addiction, and
homosexuality became potential diagnoses rather than culpable failures of volition.
What is particularly striking here is the way that the contemporary prestige of somatic models gradually redefined these behaviors as appropriately within the purview of medicine. The very fact that these novel but omnipresent "ills" manifested themselves exclusively in the form of behavior only emphasized the need to presume an underlying physical mechanism; without one, they could hardly be seen as acquired ailments or constitutional proclivities (the only presumed bases for genuine sickness).
The boundaries of medicine were expanding in the late nineteenth century and, to an articulate minority of self-consciously progressive physicians, that expansion constituted progress toward a more just and enlightened society. A growing secularism paralleled and lent emotional plausibility to this framing in medical terms of matters that had been previously construed as essentially moral. Science, not theology, most physicians believed, should be the arbiter of such questions.
The physician, not the priest or judge, was now viewed as the most appropriate guardian of the rights of society and the individual. The sufferer from phobias and anxieties, the victim of sexual incapacity, the man or woman consumed with desire for a socially unacceptable love object could be seen as the product of his or her material condition rather than as an outcast. By no means all contemporaries accepted such views. But to the stigmatized themselves these hypothetical diagnoses may well have been palatable; given the choice, an individual might well prefer to regard his or her deviant behavior as the product of hereditary endowment or disease process. It might well have offered more comfort than the traditional option of seeing oneself as a reprehensible and culpable actor. The secular rationalism so prevalent in the late nineteenth century freed many Americans from a measure of personal guilt at the cost of being labeled as sick. Not until the second half of the twentieth century, however, has this come to seem a problematic bargain.
Late nineteenth-century medical practitioners became active in another area, apparently reflecting the laudable and inexorable expansion of medical responsibility. This new area was public health and, in particular, the shaping of an interventionist social agenda. These reformist and environmentally oriented policy guidelines seemed no more than appropriate responses to the findings of contemporary epidemiology. Sickness was repeatedly connected with poverty and deprivation. The
conclusions seemed obvious to reformers. An enlightened society should purify its water, provide pure milk for its children, inspect its food, and clean its streets and tenements. The expansion of public medicine was connected in a score of ways with the style of self-consciously and self-righteously enlightened government we have come to associate with progressive reform. Moreover, there appeared to be no inherent conflicts among the expansion of medical authority, the clothing of that authority in the guise of scientific reductionism, the proliferation of disease entities, and the vision of a good society. In fact, this confluence of factors seemed necessary and necessarily benevolent. This optimistic and activist tradition still informed the assumptions and hopes of most advocates of social medicine in the 1930s and 1940s.
In the past two decades, however, this configuration of views has appeared to many social critics as neither necessary nor unambiguously benevolent. Medicine has been confronted with a multisided crisis in public expectation. Even those Americans least critical in their attitude toward the benefits of continued medical progress are concerned about the monetary cost. Others who are more skeptical, but still willing to concede the real equities of contemporary medical practice, deplore the ethical and human costs of bureaucratic, episodic, high-technology care. Again and again these concerns focus on the definition of disease.
The first widely expressed concern arose in regard to mental illness; it constituted what I have called elsewhere a "crisis in psychiatric legitimacy." It might with equal justice have been termed a crisis in the cognitive and administrative management of deviance. Beginning in the early 1960s sociologists and social critics began to emphasize the arbitrariness of psychiatric categories and to contend that they were in essence labels, culturally appropriate ways of stigmatizing deviance. Psychiatric thought was in good measure a mechanism for framing, and thus controlling, deviant behavior. The force of this radical critique was underlined by a nagging truth.
Medicine had already come to play a prominent role in relation to just those areas (such as sexual deviance, addiction, and even criminality) where supposedly pathological behaviors fit least comfortably into the pathological model that has explained and legitimated conventional categories of somatic illness. Psychiatry still lacks a mechanism-specific understanding of the great majority of the syndromes it treats.
A dramatic tension thus remains between psychiatry's cognitive legitimacy and clinical responsibilities. Nor is it an accident that the specialty fits uneasily into medicine's status hierarchy. The recent expansion of interest in somatic approaches to psychiatric ills demonstrates these inconsistencies as much as it does the accumulation of new knowledge and new techniques.
A second area of disease-related conflict has turned around the dominance of acute, interventionist models in medical-career priorities and institutions. The prestige of medicine and the personal health expectations of Americans have increasingly come to turn on the efficacy of scientific, interventionist medicine, a system of values and expectations that has been built into the economic as well as intellectual basis of American health care in the past half century. Yet it is a system that is widely perceived as having failed to provide adequate care for the old and chronically ill, or even humane death for the moribund.
Third-party, employer-based insurance has also been structured around the hospital and explicit disease entities. So have federal health insurance schemes. Disease has served as a moral and logical rationale for these bureaucratic reimbursement systems even though payments correspond to days of hospitalization, physician visits, or particular procedures. Specific disease entities have come to mediate between the conceptual world of medicine and the expectations of laypersons. Interactions between doctor and patient ordinarily take place in units defined and bureaucratically justified by the existence of real or presumed sickness. Health insurance has provided a measure of care and emotional security for millions of Americans and a steady flow of income to hospitals and hospital suppliers. But the levers controlling that cash flow can only be pressed by physicians. The language of diagnostic categories at once helps to expedite and to legitimate this special relationship among physicians, patients, and health insurers. Physicians in the mid-1980s complain of the growing influence of cost accounting and bureaucracy and their decreasing role in making care decisions. Diagnosis-related groups seem an obvious justification for such fears. Yet these diagnostic categories are product and symbol of, and condign punishment for, the rigid and unresponsive aspects of our cost-plus, disease-legitimated system of third-party payment. It is a system, moreover, in which physicians and the values of scientific medicine have played a pivotal role.
Rising costs have helped remind us that sickness as experienced comes in units of people and families, not of discrete, codable diagnostic entities. It is significant that socially minded physicians throughout the
first half of this century repeatedly cautioned that patients had families, that managing an acute episode of sickness or trauma did not exhaust the possible universe of medical care options. As early as the 1920s a minority of clinicians warned that chronic and geriatric problems would become increasingly significant as the incidence of acute infectious ills declined; they warned as well that episodic, hospital-based treatment was inadequate for the optimum care of such ailments. Few contemporaries bothered to disagree, yet such concerns became, in fact, increasingly marginal to the actual work routine of many physicians, especially the specialized and often research-oriented academic elite.
A third kind of conflict grew out of the success of medicine itself in helping banish the randomness of acute infectious illness from the perceived life chances of most Americans. The great majority of our children live to adulthood. We enjoy a greater confidence in predicting our future, but at the cost of granting enormous social power to medical practitioners and institutions. It was in some ways a mutually advantageous contract like that between the psychiatrist and the depressed or deviant patient. But even the most dramatic and undeniable achievements of medicine have their social costs.
One such cost lies in the growing problem of chronic and degenerative ills. Another lies in our cultural habit of dealing with a diversity of elusive social problems by reducing them to technical terms holding out the promise of neat solutions. Even the most dramatic technical achievements may simply redefine problems, not solve them; or they may create new difficulties in the process of solving old ones. The neonatal intensive care unit is a case in point; so are renal dialysis and cardiac transplants. The elusive phrase, "quality of life," has become increasingly familiar in the past decade. It is hardly an accident.
As the economic and emotional stakes increase, so does the likelihood of conflict. The social meanings of disease have become increasingly the subject of debate and negotiation. Matters of cost are in some ways simple enough. Questions of value can be even more elusive. Is the prevention of sickle-cell anemia through genetic counseling a blow for equal rights or an opportunity for masked genocide? Does a collective social interest require that individuals be forced to use seatbelts? Does calling premenstrual syndrome a disease liberate or enslave women? Does the imposition of mandatory maternity leave constitute justice or handicap women in the economic marketplace? Things were much simpler for the majority of reformers in progressive America. The control of women's hours and conditions of labor seemed to them an unambiguous social good, and woman's role seemed ultimately and unambiguously domestic.
In still another area, dominance of the disease entity has left the profession ill-prepared to address other medical problems that are not as easily construed in such terms. This is certainly one reason for the comparative lack of interest in geriatrics, chronic care, and maternal and child health. The old and chronically ill cannot, except episodically, be seen as sufferers from discrete and meliorable ills. Neither conceptually nor actuarially do they fit comfortably into contemporary practice patterns. The monitoring of particular organs or intervention in acute episodes has already become the responsibility of one specialty or another; the patient constitutes a residual category. Similarly, victory over the most important and accessible causes of infant and early childhood mortality has left the profession little concerned with the "lingering" aspects of the problem, which are politically sensitive and not easily amenable to exclusively technical solutions. It is clear, for example, that the neonatal intensive care unit is not an all-sufficient answer to the problem of low birth weight and prematurity, but it is a more congenial and prestigious approach, and seemingly less elusive than the economic and political measures that are its natural counterparts. Similarly, the laboratory response to AIDS has been better funded and more focused than logically parallel efforts in the sphere of education and prevention.
The status of the medical profession, like the meaning of disease, has in the past decade become more rather than less ambiguous. As the technological capabilities of medicine become ever more dramatic, as we transplant hearts and fertilize ova in vitro, we have seen the parallel growth of skepticism and even hostility among laypersons. Such ambivalence is in fact an important component of attitudes toward medicine, technology, and the bureaucracies that embody and administer medical care. At the same time, we have by no means banished disease, even if we have altered the forms in which it is most likely to become a part of our lives. We still have to construct frameworks of understanding and reassurance within which we make sense of its inevitable exactions. Scientific medicine provides a fundamental, and to many individuals well-nigh exclusive, element in shaping that understanding even in those ailments for which no effective treatment is available.
For many Americans, the meaning of disease is the mechanism that defines it; even in cancer the meaning is often that we do not yet know the mechanism. To some, however, the meaning of cancer may transcend the mechanism and the ultimate ability of medicine to understand it. For such individuals the meaning of cancer may lie in the evils of capitalism, of unhindered technical progress, or perhaps in failures of individual will. We live in a complex and fragmented world and create a variety of frameworks for our manifold ailments. But two elements remain fundamental: one is a faith in medicine's existing or potential insights; the other is personal accountability.
The desire to explain sickness and death in terms of volition, of acts done or left undone, is ancient and powerful. The threat of disease provides a compelling occasion to find prospective reassurance in aspects of behavior subject to individual control. Mental illness was, for example, commonly explained in the past as a possible consequence of habit patterns gradually hardened into uncontrollable pathologies. Those who avoided even occasional lapses would have little to fear. In the nineteenth-century epidemics of cholera, as we have seen, there was much talk of predisposition. The victims' behavior or place of residence explained why they, in particular, succumbed to a general epidemic influence. With decreasing fear of acute infectious disease in the mid-twentieth century, Americans have turned increasingly to a positive concern with regimen, with diet and exercise, as they seek to reduce their real or sensed risk, to redefine the mortal odds that face them. The other side of the coin is a tendency to explain the vulnerability of others in terms of their acts: overeating, alcoholism, sexual promiscuity.
It is into this world that AIDS arrived, almost as novel and frightening a stranger as cholera a century and a half ago. We were not entirely prepared. Antibiotics had removed much of the fear traditionally associated with acute infectious ills. Most laypersons had come to assume that such afflictions had succumbed to the laboratory's insights. Children no longer died of diphtheria; plague and cholera no longer killed masses of men and women. Tuberculosis, too, had declined, along with typhoid and other water-borne diseases. Penicillin had robbed syphilis of much of the fear that had so long surrounded it. The age of great and intractable epidemics seemed to have passed, and most laypersons assumed, whether accurately or not, that medical therapeutics deserved the credit.
But AIDS is both mortal and intractable. It provokes memories of the
fear that helped create cautionary and reassuring explanations for plague or cholera in earlier centuries. An ailment that combines sexual transmission with a terrifyingly high mortality, AIDS was bound to attract extraordinary social concern (in clear contrast with the shallower and more transitory social response to herpes, which, despite the media attention abruptly showered on it, could not mobilize the same level of social concern). It reminds us of the way society has always framed illness, finding reasons to exempt and reassure in its agreed-upon etiologies. But it also reminds us that biological mechanisms define and constrain social response. Ironically, this new disease reflects both elements, the biological and the cultural, in particularly stark form. Only the sophisticated tools of modern virology and immunology have allowed it to be defined as a clinical entity; yet its presumed mode of transmission and extraordinary fatality levels have mobilized deeply felt social attitudes that relate only tangentially to the virologist's understanding of the syndrome. If diseases can be seen as occupying points along a spectrum, ranging from those most firmly based in a verifiable pathological mechanism to those, like hysteria or alcoholism, with no well-understood mechanism but with a highly charged social profile, then AIDS occupies a place at both ends of that spectrum.
The social response to AIDS also reminds us that we live in a fragmented society. To a substantial minority of Americans, the meaning of AIDS is reflected in, but transcends, its assumed mode of transmission. It was, that is, a deserved punishment for the sexual transgressor; the unchecked growth of deviance was a symptom of a more fundamental social disorder. "Where did these germs come from?" a writer to an urban newspaper asked in the fall of 1985. "After all this time, why did they show up now? . . . God is telling us to halt our promiscuity. God makes the germs, and he also makes the cures. He will let us find the cure when we straighten out." It is significant that this same correspondent felt compelled to add that he was not "a religious fanatic," for the great majority of Americans accept the authority of medicine and the reality of its agreed-upon knowledge. They look to the National Institutes of Health, not to the Bible, for ultimate deliverance from AIDS.
The meaning of scientific knowledge is determined by its consumers. When certain immunologists suggest that predisposition to AIDS may grow out of successive onslaughts on the immune system, it may or may not prove to be an accurate description of the natural world. But to many ordinary Americans (and perhaps a good many medical scientists as well) the meaning of such a hypothesis lies in another frame of reference. As was the case with cholera a century and a half before, the emphasis on repeated infections explains how a person with AIDS had "predisposed" him- or herself. The meaning lies in behavior uncontrolled. When an epidemiologist notes that the incidence of AIDS correlates with numbers of sexual contacts, he may be speaking in terms of likelihoods; to many of his fellow Americans he is speaking of guilt and deserved punishment.
Of course, it was to be expected that patients who contracted AIDS through blood transfusions or in utero would be casually referred to in news reports as innocent or accidental victims of a nemesis both morally and epidemiologically appropriate to a rather different group. The very concept of infection is and always has been highly charged; enlightened physicians have always found it difficult to make laypersons accept their reassurances that particular epidemic ills might not be infectious. The fear of contamination far antedates the germ theory, which in some ways only provided a mechanism to justify these ancient fears in modern terms. It is hardly surprising that many remain unconvinced by authoritative medical assurances that AIDS is not (or is not very) contagious.
Knowledge needs to be understood within highly specific contexts. And the specific content of that knowledge itself needs to be seen as a social variable. AIDS underlines the inadequacy of an approach to understanding and controlling disease that ends at the laboratory's door. But it also emphasizes the parallel inadequacy of disregarding the specific biological character of an ailment and the status of our understanding of that character.
Our experience with AIDS emphasizes this commonsense point. As our knowledge of the syndrome changes, so do choices and perceptions. Aspects of our culture as diverse as insurance, civil rights, education, and policy toward drug addiction have all been illuminated by our increasingly circumstantial knowledge of AIDS as a biological phenomenon. Knowledge may be provisional, but its successive revisions are no less important for that. With each revision, the structure of choices for individuals and society changes. Without a serological test for exposure to AIDS, for example, there would be no debate about screening, access to insurance, and civil rights (not to mention the dilemma of millions of individuals who seek to define their own risks and predict an unpredictable future).
There are some morals here. Perhaps we cannot return to the optimistic faith so general in the 1930s and 1940s; we are too much aware
of the costs. But we can share the fundamental understanding of the need to study the interactions between society and medicine if we are to bring the benefits of medicine to the greatest number. We are products of what might be termed a generational dialectic. Most students of the social aspects and applications of medicine cannot easily return to the optimistic faith of the 1940s. But our very wariness, our need to place medical knowledge in a cost-benefit as well as cultural context, underlines an important agenda for social medicine. If the recognition of disease implies both a phenomenon and its social perception, it also involves policy. And that policy inevitably reflects phenomenon and perception. If an ailment is socially defined as real, and nothing is done, then that, too, is a policy decision. This process of interaction between phenomenon, perception, and policy is important not only to medicine but also to social science generally. The brief history of AIDS illustrates both our continuing dependence on medicine, for better or worse, and the way that disease necessarily reflects and lays bare every aspect of the culture in which it occurs.
An earlier version of this paper appeared in The Milbank Quarterly 64, suppl. 1 (1986): 34-55. I should like to thank Barbara Bates, Renee Fox, Stephen Kunitz, Dorothy Nelkin, Rosemary Stevens, and Owsei Temkin, and the editors of The Milbank Quarterly supplement for their helpful comments as well as audiences at Cornell, Harvard, Johns Hopkins, and Columbia universities, who tolerated and criticized earlier versions.
1. "Presumed," because disease does not exist as a social phenomenon until it is somehow perceived as existing. This perception can have any one of many relationships to a possible biological substrate. [BACK]
2. For a useful, if eclectic, collection of case studies reflecting this point of view, see Peter Wright and Andrew Treacher, eds., The Problem of Medical Knowledge: Examining the Social Construction of Medicine (Edinburgh: University of Edinburgh Press, 1982). The sociological literature of the 1960s and 1970s on the "social construction" of mental illness was particularly influential in questioning the value-free, positivist conception of sickness categories. [BACK]
3. François Delaporte, Disease and Civilization: The Cholera in Paris, 1832, tr. Arthur Goldhammer (Cambridge: MIT Press, 1986), 6. [BACK]
4. For an analysis of the demedicalization movement, see Ronald Bayer, Homosexuality and American Psychiatry: The Politics of Diagnosis (New York: Basic Books, 1981). [BACK]
5. Henry Sigerist, Civilization and Disease (Ithaca: Cornell University Press, 1943), 1. [BACK]
6. Ibid., 244. [BACK]
7. See, for example, the work of Bernhard Stern, Social Factors in Medical Progress (New York: Columbia University Press, 1927); Society and Medical Progress (Princeton: Princeton University Press, 1941), esp. chs. 8-10; and Richard H. Shryock, The Development of Modern Medicine (Philadelphia: University of Pennsylvania Press, 1936). [BACK]
8. Sigerist, Civilization and Disease , 85. [BACK]
9. See, for example, George Rosen, "What Is Social Medicine?" Bulletin of the History of Medicine 21 (1947): 674-733; René Sand, Health and Human Progress: An Essay in Sociological Medicine (New York: Macmillan, 1936) and The Advance to Social Medicine (London: Staples, 1952). [BACK]
10. R. I. Lee and L. W. Jones, The Fundamentals of Good Medical Care, Publications of the Committee on the Costs of Medical Care, no. 22 (Chicago: University of Chicago Press, 1933), 15. State hospital reform in this period provides another parallel. Albert Deutsch's widely read exposés, for example, constituted a plea for the renovation and medicalization of these neglected institutions, but the therapeutic options then available inspire no great confidence today (Deutsch, The Shame of the States [New York: Harcourt, Brace, 1948]). [BACK]
11. Charles E. Rosenberg, "The Therapeutic Revolution: Medicine, Meaning and Social Change in Nineteenth-Century America," Perspectives in Biology and Medicine 20 (1977): 485-506. [BACK]
12. The analogy is obviously not exact. So far as we are aware, clinically identifiable cases of AIDS have a mortality rate of nearly 100 percent, but over a clinical course that is far more extended than that of cholera. [BACK]
13. Etiological speculations reflected and rationalized real or potential social conflict in particular national contexts. See, for example, Charles E. Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (1962; rev. ed. Chicago: University of Chicago Press, 1987); Roderick E. McGrew, Russia and the Cholera, 1823-1832 (Madison: University of Wisconsin Press, 1965); R. J. Morris, Cholera 1832: The Social Response to an Epidemic (New York: Holmes and Meier, 1976); Michael Durey, The Return of the Plague: British Society and the Cholera 1831-32 (Dublin: Gill and MacMillan, 1979). [BACK]
14. Owsei Temkin, "The Scientific Approach to Disease: Specific Entity and Individual Sickness," in Scientific Change: Historical Studies in the Intellectual, Social and Technical Conditions for Scientific Discovery and Technical Invention from Antiquity to the Present, ed. A. C. Crombie (New York: Basic Books, 1963), 629-647. [BACK]
15. For an influential if rather categorical statement of this point of view, see N. D. Jewson, "The Disappearance of the Sick-Man from Medical Cosmology, 1770-1870," Sociology 10 (1976): 224-244. See also Rosenberg, "Therapeutic Revolution." [BACK]
16. A significant antireductionist tradition has always existed among clinicians. For a recent statement of this continuing tradition, see Richard J. Baron, "An Introduction to Medical Phenomenology: I Can't Hear You While I'm Listening," Annals of Internal Medicine 103 (1985): 606-611. [BACK]
17. The psychodynamic models of behavioral disorder so influential in the first half of the twentieth century shared the determinism of their somatic forerunners, although differing in etiological emphasis. Dynamic psychiatry,
however, remained a minority and in some ways atypical aspect of American medicine, even when it loomed prominently in the world view of educated laypersons. In any case, the areas of its greatest clinical responsibilities were ones that had already been claimed for medicine before 1900. [BACK]
18. Charles E. Rosenberg, "The Crisis in Psychiatric Legitimacy: Reflections on Psychiatry, Medicine, and Public Policy," in American Psychiatry : Past , Present , and Future , ed. G. Kriegman, R. D. Gardner, and D. W. Abse (Charlottesville: University of Virginia Press, 1975), 135-148. [BACK]
19. See, for example, H. B. Richardson's pointedly titled Patients Have Families (New York: Commonwealth Fund, 1945). [BACK]
20. The most recent history of venereal disease in twentieth-century America emphasizes a continuity in social attitudes that rendered sexually transmitted ills impervious to eradication campaigns (Allan M. Brandt, No Magic Bullet: A Social History of Venereal Disease in the United States Since 1880, expanded ed. [New York: Oxford University Press, 1985]). [BACK]
21. Charles Realdine, letter to the editor, Philadelphia Daily News , 31 October 1985. [BACK]
22. During the first nineteenth-century cholera pandemic in the early 1830s, ironically, laypersons also tended to dismiss advanced medical opinion that reassured them the disease was not contagious. [BACK]