Autism, “Stigma,” Disability: A Shifting Historical Terrain

Abstract

Erving Goffman’s 1963 foundational discussion of stigma has been both embraced and critiqued in disability studies and other fields. In Goffman’s interactional and ahistorical analysis, stigma was presumed to exist as a natural feature of humanity, deflecting attention away from historical analysis. This article, in contrast, argues that stigma—particularly surrounding “mental illness”—is deeply embedded in historically contingent structural conditions of modern capitalism and ideologies of individualism that shape ideals of the modern worker. Specifically, I use the case of autism—and its commodification in the United States—to show how a stigmatized “mental illness” is intertwined with a range of financial interests that come to depend on the continued production of certain diagnoses. For example, an analysis of the “autism industrial complex” in the United States reveals how economic changes set the conditions for a range of practices that promise to reduce stigma; these include special education, activism/advocacy, and self-representation. These occur in the context of a transition toward more flexible employment and the increasing value of technological and artistic skills often associated with neurodiversity. Despite the fact that a capitalist logic continues to define valuations of personhood, families and autistic self-advocates have been empowered in recent years to use a variety of strategies to decouple stigma and illness and resist conventional definitions of autism as a syndrome of deficits.

In the time since autism was first identified as a “mental illness,” this diagnostic category has undergone remarkable changes. Once considered exceedingly rare and profoundly debilitating, it is now relatively common; once highly stigmatized, it is increasingly accepted under the banner of neurodiversity, invented and promulgated by autistic self-advocates in the United States, many of whom identify as part of the American disability rights movement. Indeed, one reason autistic self-advocates chose to represent themselves through the term “neurodiversity” was to claim ownership of and redefine the currently powerful brain-based model. The claiming of a new identity term—“neurodiversity” and its counterpart “neurotypical”—stands as a strategy to disrupt the stigma long associated with “autism-as-mental-illness.” Assigning this diagnosis a positive social value resembles the strategy of queer, crip, and fat theorists who subverted and disidentified with normative categories and definitions that have subjected them all to stigma for many decades. This article tracks the history of stigma, autism, and “mental illness,” arguing that we cannot understand the emergence of these personhood-shaping categories apart from their long-standing imbrication with the transforming political economy of capitalism and its ideologies of labor.

The Intertwined Histories of Stigma and “Mental Illness”

Stigma is the unwanted shadow of a person, produced when society disdains certain human differences. The word retains its ancient Greek meaning: a mark or branding on the body made with a sharp instrument. Often associated in its plural form (stigmata) with Christ’s crucifixion wounds, it has also come to connote a flawed psychological or physical state. Stigmatized people are often seen as incompetent, blamed for their suffering, and socially marginalized in ways that we might now consider “ableist.”

Experts and advocates decry stigma’s persistence. Former directors of the National Institute of Mental Health (NIMH), Steven Hyman and Thomas Insel, have repeatedly called stigma an international “public health crisis” (Insel, Collins, and Hyman 2015). Surgeons general have continually declared war on stigma. According to the US Department of Health and Human Services, stigma is “the most formidable obstacle to future progress in the area of mental illness and health” (DHHS 1999, 2001). However, such advocates and institutions rarely define stigma, identify its causes, or suggest ways to reduce it beyond improving mental health awareness, education, and treatment.

The scholarly literature on stigma—mainly in the fields of psychology and sociology—draws its inspiration from Erving Goffman’s classic work Stigma: Notes on the Management of Spoiled Identity (1963), which conceptualized stigma as interactional and performative. Written before identity politics, intersectionality, and the social model of disability were available constructs, Goffman’s ahistorical analysis focused on individuals living in contexts where stigma’s existence is presumed and must be managed. He placed the burden of management on discredited individuals who need to hide or mitigate the public exposure of their stigmatizing conditions. Almost all of us, he says, will at some point in time be devalued, if not because we have some discrediting attribute, then because we have social connections to someone who does.1 Due to Goffman’s influence, academic works on stigma deflected attention away from historical and cross-cultural analysis, instead refining a typology of the concept and describing the negative effects of stigma and the cognitive processes involved with classification and stereotyping. Moreover, historically informed sociological analyses tend to emphasize how stigma motivates historical change, as opposed to how historical processes produce particular forms of stigma (see, e.g., Major, Dovidio, and Link 2018).

Yet history tells us that stigma—a culturally specific concept—is highly variable across time and place (Tyler and Slater 2018). It does not derive solely from ignorance or an individual’s failure to navigate the psychological machinations of the presentation of self in everyday life. As Brendan Gleeson puts it, Goffman’s “interactionist fallacy” masks the structural forces that underwrite personal encounters and their meanings (Gleeson 1999:17; Schweik 2014). From its inception, stigma has been bound up with ideas about “mental illness” in Europe and North America; I argue that their two histories can be told as one. Mental illnesses became stigmatized as this label was increasingly deployed as a modern category for the idle, particularly as capitalism developed. Doctors, politicians, and other “experts” on public health isolated people they deemed economically unproductive. Over the past three centuries, neither awareness nor medical and scientific advances have greatly affected the ebb and flow of the stigma of the many conditions classified under the rubric of mental illness, whether explained via conventional psychosocial and psychoanalytic frames or the more recent neurobiological models (see, e.g., Angermeyer and Matschinger 2005; Pescosolido et al. 2010; Reed et al. 2016). Just as ignorance is not wholly to blame for stigma, neither does scientific knowledge erase it. Stigma comes from deep structural conditions, such as capitalism, ideologies of individualism and personal responsibility, and the complicated legacies of racism and colonialism. Our dynamic conceptions of mental illness ride on the waves of broader cultural changes, and when science or medicine does ameliorate the shame of suffering, it does so as the servant of culture.

In this article I examine the dynamics of stigma through the tectonic shifts of economic and political structures and accompanying ideologies of exclusion and inclusion. I argue that stigma emerged out of the structural conditions shaping capitalism, including ideologies of individualism, personal responsibility, and the complicated legacies of colonialism. Stigma must be challenged in the context of those conditions, as the recent and successful efforts of autism activists and autistic self-advocates demonstrate. This is not to suggest that socialist states or noncapitalist communities do not stigmatize difference. The idea that a person should be valued for individual productivity was already in place in communities that later embraced other forms of governance.

Moral judgments about “mental illnesses” reflect what, at certain times and places, people consider the ideal society and person. The same holds true for physical disabilities when communities perceive them as violations of a properly ordered life (Murphy 1987:29) or as “nature’s mistakes” (Bogdan 1988:6). The most stigmatized people tend to be those who do not conform to the ideal modern worker: the autonomous, self-reliant individual. Indeed, I argue that the burden of stigma changes along with the ideals of the modern worker. At the same time, neurodiversity, which was explicitly modeled on the social model of disability (Shakespeare 2010), has become an exemplar for how economics can play a role in destigmatizing a previously highly stigmatized condition. New accommodations and greater accessibility in work and community life have helped many who identify as neurodiverse make claims on inclusion (Silberman 2015:472–473). As some workplaces become more flexible, valued twenty-first-century workers—including those with autism—might be self-employed, work part-time, combine paid work with family care or volunteerism, interact virtually rather than in person, and continue to live with their parents after the arbitrary ages of adulthood, such as 18 or 21. As Rapp and Ginsburg (2011) have shown, the “difference of disability” reverberates over the life course, creating “new kinship imaginaries” as many families learn to accommodate atypical lives. Popular representations of contemporary autistic imaginaries suggest that young adults with autism can succeed in the workplace not despite their differences, such as restricted interests in technology and numbers, but because of them. They might enjoy repetitive administrative and technical tasks that neurotypical others eschew, such as filing, inventory management, and animal care. Such flexibility in assessing social and economic worth has made it possible for people like my daughter Isabel, who self-identifies as autistic, to celebrate forms of difference that were once disdained and hidden; they can become valued and visible parts of economic and community life to a degree that was previously impossible.

An increasing number of people with disabilities are being offered more accommodations and job support as alternative work schedules become available to a growing proportion of workers in the United States, the United Kingdom, and most G20 countries (Meager and Higgins 2011). The Organisation for Economic Co-operation and Development (OECD) predicts that “such increased flexibility will provide greater opportunities for underrepresented groups to participate in the labour market, such as women, senior workers and those with disabilities” (OECD 2017). The move from factory manufacturing toward flexible production and the democratization of communication via social media and other digital forms have opened up a degree of community integration that was formerly inaccessible to many people with disabilities. Despite those advantages, many scholars and policy makers consider such flexible “contingent work” potentially exploitative, given that part-time employees are often denied full benefits (Barker and Christensen 1998; Belous 1998; Thomason, Burton, and Hyatt 1998). Yet more flexible work schedules make it possible for people with physical and mental disabilities to work, resisting norms that prevent them from becoming integral parts of a community.

Research on the trajectories of autistic adults, and employment opportunities in particular, lags behind research on children and special education (Wehman et al. 2014). Nonetheless, there is a growing body of literature that suggests that people with autism, while continuing to face discrimination, are capable of succeeding in competitive inclusive employment (Wehman et al. 2013), especially if they had work experience as teenagers (Siperstein, Heyman, and Stokes 2014). Workplace changes may valorize those who would previously have been denigrated, like the person on the autism spectrum with a talent for high technology, more comfortable working and interacting with others online. Autistic adults sometimes excel in areas of significant job growth, such as engineering and other professions that rely heavily on mathematics. The Kessler Foundation’s reports on National Trends in Disability Employment (nTIDE) show increasingly positive trends for employment of people with autism,2 although the number of people with cognitive disabilities employed full- or part-time in the United States has decreased over the past three decades, even after passage of the 1990 Americans with Disabilities Act (ADA).3 For autism, even where employment rates remain stagnant, salaries are improving (Hendricks 2010; see also Feinstein 2018).

Psychiatric disorders and developmental disabilities have become increasingly normalized over the past several decades, a stunning reversal of a shameful and stigmatized history. In 1944, for example, one of the most celebrated twentieth-century psychologists, Erik Erikson, sent his infant son Neil, born with Down syndrome, to a residential institution and told everyone, including his other children, that the baby had died at birth (Friedman 1999). In the 1960s and 1970s, children with autism were often diagnosed with childhood schizophrenia or mental retardation, and schools and employers offered few opportunities. With no evidence to back up their accusations, clinicians commonly blamed autism on supposedly unloving “refrigerator mothers” (Bettelheim 1972) and conceived of autism in the framework of psychotic disorders (Tustin 1995). In that context, few parents wanted to disclose that they had a child with autism (Grinker 2007); growing up, I knew no one with autism. But in the twenty-first century, many parents discuss their children’s cognitive, emotional, and behavioral diagnoses, medications and doses, and battles with their school systems, and might even compare them to Bill Gates, Isaac Newton, and Vincent van Gogh, all mythologized as possibly autistic. Given this dispensation, most people I meet in my social and professional orbit today personally know several people with autism. We are witnessing what Rapp and Ginsburg describe as “the expanding arena of public intimacy around the experience of disability” (2011:395).

This more flexible sense of personhood or citizenship is apparent in almost every aspect of our existence—in the fluidity of our ethnic, gender, and racial classifications, for example—along with “new kinship imaginaries,” in which “atypicality is the norm” (Rapp and Ginsburg 2011:406). Our identities are becoming more fluid and negotiable, and this plasticity goes well beyond recognizing complex multiethnic, multiracial personhood. Many forces are contributing to these shifts. For example, globalization encourages forms of personhood consonant with borderless markets and claims for universal human rights, transcending any particular nation or government (Lewellen 2002). Other influential developments include the acceptance of transgendered identities and nonbinary sexual preferences, religious or spiritual fluidity, and new kinds of sociality that transcend the division of species, as with companion animals such as therapy dogs (Solomon 2010). In short, diverse forms of personhood are generally more accepted and considered less threatening today, despite the persistence and resurgence of bigotry, racism, anti-Semitism, and discriminatory immigration reforms throughout the world.

In this context, mental health fields are shifting from contained diagnostic categories, in which one has this disease or does not, to a dimensional view; nearly all disorders are now considered to be distributed differentially across the population. As a result, researchers, including the authors of the Diagnostic and Statistical Manual of Mental Disorders, fifth edition (DSM-5), have reframed major diagnoses as spectrum disorders (e.g., schizophrenia spectrum disorders, bipolar spectrum disorder, and the depressive spectrum), as happened with autism more than a decade ago. In this new model, clinicians ideally pay more attention to describing the severity of a patient’s various symptoms than to assessing whether a patient meets every criterion for a specific disorder.

The stigma of “mental illness” is modestly decreasing across many locations, although this is uneven at best. In my field sites in Africa, Asia, and North America, medicalized psychiatric conditions as well as post-traumatic stress disorder among survivors of war, natural disasters, and sexual violence are increasingly accepted as distributed throughout the entire population (Breslau 2004; Grinker and Cho 2013; Grinker et al. 2012). Despite these steps toward fluidity and flexibility, in some parts of the world people with cognitive disabilities are imprisoned in their own homes or in state institutions. Numerous conditions have lost stigma, but other disorders such as schizophrenia and addictions remain in stigma’s shadow. In the United States, Alcoholics Anonymous remains anonymous. The very phrase “mental health” is designed to avoid the connotations of sickness. Even the National Institute of Mental Health, the leading federal agency for research on mental illness, does not call itself an institute of mental illness, though the other national institutes are named for diseases (e.g., the National Cancer Institute).

Nonetheless, greater recognition of developmental disorders, even in low- and middle-income countries, has catalyzed early intervention programs and special education (Grinker, Yeargin-Allsopp, and Doyle 2011). Celebrities no longer hide their mental health challenges; they shine a light on them.4 We have not been able to put it into words, but most of us can sense that something positive is happening. One need only listen to the way millennials speak more openly about their mental suffering. A student in one of my classes described her struggle to find treatment for attention deficit/hyperactivity disorder (ADHD) when she was in high school. Her father told her she did not have ADHD and that she simply was not working hard enough to get good grades. She begged to see a psychiatrist but had to wait until she went to college to act on her own. “Getting diagnosed with ADHD,” she told our class, “was one of the best days of my freshman year because someone actually saw that I wasn’t stupid or lazy, that I just needed treatment to help me do better.” Another student even wore a T-shirt that read “I hate normal people.” In fact, the contemporary term for normal that comes from the neurodiversity movement, “neurotypical,” does not really mean normal. It refers, critically, to people who conform to society’s definition of the normal.

While such acceptance and visibility reflect a change in awareness and education, I would argue that they are epiphenomenal, masking deeper structural and historical forces.

Reframing Mental Illness: From Asylums to Consumer Activism

Capitalism did not cause psychological impairments; rather, psychological impairments acquired new meanings under capitalism. In every era, there were people who were seriously depressed, who had delusions and hallucinations, whose moods swung wildly; there were people who could not speak or, if they did, spoke mostly to themselves, who purposefully injured themselves, who were unable to take care of their daily needs (Dols 1984, 1987). But only during the first industrial revolution did the person addicted to alcohol become an alcoholic, the person who hears voices become a schizophrenic, and so on. Before psychiatry began as a discipline in the late eighteenth century, mental illness was not a category distinct from physical illness (Foucault 1998 [1965]; Hacking 2004). Mental illness—the notion of a distinct group of abnormalities of thought and behavior—is a distinctly modern invention, appearing in Europe and the African colonies in the early nineteenth century, and then in East Asia by the late nineteenth century, as the result of European influence (Gilman 1982; McCulloch 1995; Scull 1979; Swartz 1995; Yoo 2016). The unseemly history of mental illness, including the growth of horrific asylums, illustrates the vital role classification played in the simultaneous emergence of mental illness and stigma.

Most European asylums during the eighteenth and much of the nineteenth century were populated by prostitutes, criminals, drunkards, heretics, and the homeless, often undifferentiated, lumped together as one because they were considered to have one thing in common: lack of reason. Insanity or madness was just one kind of unreason. Commenting on the 1656 opening of the Hôpital Général in Paris for the poor and idle (including those later classified as insane), Foucault says of these heterogeneous residents that “there must have existed a unity which justified its urgency” (Foucault 1998 [1965]:45, 2006). Far before the invention of psychiatry, that unity consisted of people defined not in terms of their individuality but through their “unreasonable” relationship to work and to the economy in the city. As historian Andrew Scull writes, the relationship between urbanization and asylums may not have been simple, but it was certainly constitutive. Rather than attribute the growth of the psy-professions to advances in knowledge, “the main driving force behind the rise of a segregative response to madness (and other forms of deviance) can much more plausibly be asserted to the effects of a mature capitalist market economy” (Scull 2005:29). The market prompted the “abandonment of long-established techniques for coping with the poor and troublesome (including troublesome members of the more affluent classes)” (Scull 2005:29). At the same time, secularization in Europe and North America led to a new emphasis on individual agency and self-reliance. The charitable work of the Christian churches gradually waned too, as, in Marcel Mauss’s words, everyone “was now one’s own priest” with an “inner God” (Mauss 1985 [1938]).

Importantly, as Foucault showed, the asylum did not directly impose new subjectivities but provided a kind of technology that enabled the imposition of the identity of pauper—a new and shameful category of being. In the secular world of science and medicine, mental illnesses became diseases of the mind—evidence of the moral and cognitive failure to regulate oneself. The asylum was not a hospital for treating illness but a separate world of discipline in which administrators used whatever tools they needed—chains, stakes, cages, for example—to subdue and control the “unreasonable” and, to some extent, inhuman, since reason was considered the essence of humanity. The people represented as “the insane” in many European paintings before the nineteenth century (Gilman 1982, 1988) included those defined by their failure to become proper members of the working class, a definition that persisted in the United States well into the twentieth century. Even as recently as 1960, doctors framed the lobotomy, a drastic measure for discipline and control, in economic terms: as disability studies scholar Jenell Johnson notes, “the ultimate indicator of a lobotomy’s success was its ability to return patients to gainful employment” (Johnson 2011:194). Today, the World Health Organization still includes “productive” work in its definition of mental health.5

Historians have criticized Foucault for providing too little evidence for a “great confinement” or for its uniformity throughout Europe (see, e.g., Midelfort 1980). But the critique misses the point, since the numbers are irrelevant (Porter 1990). Exclusion was powerful as an idea: the concept of confinement was salient even when relatively few people were actually confined. In London’s oldest asylum, Bethlem (Bedlam), there were generally only about three dozen inmates, yet Bedlam figured prominently as an imaginary in art and literature, in parents’ admonitions to their children, in threats to the disorderly (see Cross 2012; Torrey and Miller 2001:11). Even the humanitarian reform known as “moral treatment” was guided by a capitalist imperative. “Moral” for doctors such as Philippe Pinel (1745–1826) meant “psychological,” not material or physical, a kind of tough love to be exercised in the service of potential liberation and employment of the idle. Folie (madness), Pinel wrote, was alienation from the social order caused by primitive, as opposed to civilized, impulses. An expert on eighteenth-century Francophone literature notes, “Folie, which until the latter half of the eighteenth century had been understood as the general incapacité … à suivre les rythmes de la vie collective [inability to follow the rhythms of social life], became through Pinel the bona fide medical condition known as aliénation mentale” (Chilcoat 1998:12). This psychological definition was a major break from the past because, in a sense, a specifically “mental” illness had been born.

The emergence of these seemingly objective concepts in Western Europe was paralleled by another development no less crucial to the stigma of mental illness, in which scientists sought to create stable categories of being and difference: the invention of the female. Just as there were no mental illnesses before the advent of psychiatry, neither were there females, since scientists believed, as they had for centuries, that men and women were part of a single sex—male—of which women were just imperfect men (Laqueur 1990). But by 1800, scientists divided men and women into incommensurable, fixed categories, a categorization that was essential for social order in an increasingly industrialized Europe.

The invention of the female was part of the same process of modern classification as was mental illness. In both France and England, scientists now saw women as defined by their bodies, as beings tied more closely to nature than men. Describing the stigma of being female in the early nineteenth century, one historian notes, “In moral discourse there was hardly any overlap between the active, rational, resolute male and the emotional, nurturing, malleable female. The two sexes were essentialized, and woman was constructed as ‘other’ in a more absolute sense than ever before” (Tosh 2005:336). This separation made it even easier for experts to fix stereotypes of femaleness, including a tendency to equate women with mental illness. It was just a short step to associating mental illness with women through the “nature” of their sex (Chilcoat 1998:13), a development that led to the creation of hysteria as a mental illness category, as well as the development of a pathological model of human sexuality.6

Well into the twentieth century, the association of mental illness with idleness extended to people believed to be inherently inferior, especially the members of colonized populations. During World War I, instead of using the stigmatizing term “hysteria” for officers suffering emotional or unexplained bodily distress, British physicians used the less stigmatizing term “shell shock,” whether the soldier saw combat or not. The term denoted an appropriate, understandable male response to stress, rather than anything inherent in the individual (Mosse 2000:101–102). But doctors retained the feminizing term “hysteria” to describe mental illnesses among working-class soldiers, Jews, the Irish, and colonial subjects, most of whom seldom received treatment (Bogacz 1989:230). Doctors in Germany deemed non-elites untreatable because they were considered inherently inferior, weak, and “work-shy” by virtue of their birth and upbringing (Lerner 2003). Emotional distress was proof of their nature; the diagnosis reinforced their inferior position in a gendered hierarchical system of social classification.

In the United States the stigma of the most commonly diagnosed mental illnesses (such as anxiety, affective disorders, and war trauma) waxed and waned through the first half of the twentieth century in relation to World War I, World War II, and the Korean War.7 Stigma tended to decrease when the military created terms like “shell shock” or “war neurosis” and used the dubious strategy of delaying psychiatric diagnosis and treatment to keep soldiers in combat. The stigma of mental illness then increased postwar when the economic costs of chronic sickness strained government budgets. Despite more than a century of concerted attempts by advocates (such as Dorothea Dix during the nineteenth century) and ex-psychiatric patients (like Clifford Beers and Rachel Grant-Smith in the early twentieth century) to resist dominant discourses on mental disability, mental health advocacy emerged in earnest only during the 1960s, along with other kinds of civil rights advocacy. By 1973 gay rights and veterans’ rights leaders, along with ex-patients and young, progressive psychiatrists, were able to persuade the American Psychiatric Association to remove homosexuality from its list of mental disorders and to begin serious consideration of a new concept of post-traumatic stress disorder (Borus 1975; Haley 1974; Kutchins and Kirk 1997; Young 2008).8 But with the exception of those two groups—homosexuals and veterans—the so-called mentally ill were still relatively silenced. The experts still assumed that suffering individuals could not speak for themselves about stigma and discrimination. This assumption was especially salient for individuals with physical disabilities (Epstein 2007; Funk et al. 2006).

Working within the framework of the market, patients adopted an economic language by calling themselves “consumers” or “consumer/survivors” (Tomes 2006) and appealing for the right to be heard less on the basis of their rights to health care as citizens than as consumer-citizens. Patients, condescended to and feeling disempowered by the medical professions (and by what was sometimes referred to as the medical industrial complex), aimed to resist medical paternalism by appropriating the discourse of the market. In the consumer narrative, doctors should not only heal but use their resources rationally; patients should be able to shop not only for doctors but also for providers who practice outside the medical establishment—those providers that physicians want to marginalize from the medical marketplace. Government protections for the patient increasingly came not from constitutional law but from consumer regulation protections (Tomes 2006:86–87). Patient advocacy and federal laws to protect the “mentally disabled” (e.g., the 1963 and 1965 Community Mental Health Centers Acts) together posed a serious threat to psychiatrists’ monopoly on mental health care. Patients who had once been characterized as passive (literally “patient”) and irrational, in contrast to the active, rational doctor, were now not only wise shoppers but people who could represent themselves (Halpern 2004).

The Case of Autism

Doctors once considered psychiatric nosology of little concern to the public at large. As one of the editors of the DSM-IV and DSM-5 told me, “The DSM used to be just for doctors.” After publication of the DSM-III in 1980—the first diagnostic manual to define mental illnesses as constitutive of the person and in terms of chronicity rather than as temporary reactions to environmental stressors—psychiatric classification became an integral part of psychiatric training, research, and care. DSM categories became essential to a range of practices, such as advocacy and lobbying, global health outreach, pharmaceutical advertising, and of course, insurance reimbursement for clinical care and other services (Kutchins and Kirk 1997). Just as importantly, the DSM began to resemble a dictionary that provided the public at large with a language of seemingly legitimate and objective mental illness categories.

The extraordinary growth in the number of psychiatric diagnoses in the United States and throughout the world resulted in an apparent “epidemic” not only of autism but of other conditions as well, most of which are linked to extensive financial interests. Insanity in the nineteenth century was a financial matter largely for churches and government, but in the twenty-first century the private business income generated by mental illnesses exceeds $135 billion a year, about one-quarter of that amount in prescription drug fees. Mental health care costs constitute about 20% of all spending for physician and clinical services in the United States.9 The US Centers for Disease Control and Prevention (2005) estimates that approximately 11% of the 51 million children ages 4–17 years in the United States have at one time been diagnosed with ADHD and that half of those individuals have at one time taken stimulants as a treatment (e.g., Ritalin). The number of ADHD diagnoses and prescriptions of stimulants is increasing outside the United States as well (Polanczyk et al. 2007). Sales of stimulants for the treatment of ADHD produce more than $12 billion in income for pharmaceutical companies. A similar pattern can be found for adults. At least 9% of Americans, for example, have depression in any given year, and 10% are taking antidepressant medications, the same proportion as take statins, the medicines that lower cholesterol. The point is not that children and adults are being overdiagnosed and overtreated—that is a subjective judgment open to a range of interpretations—but that a particular diagnosis became embedded in a financial system that has come to depend on that diagnosis for its sustainability and growth. For example, the FDA recently approved Roche’s drug balovaptan for fast-track clinical trials as a treatment for social communication impairments; other companies are banking on finding breakthrough medications in the near future. Given the absence of any medical intervention, autism is a dream frontier for the pharmaceutical industry.

Because autism is a childhood-onset condition, it falls within the domain of school systems, where the diagnosis has had the most financial significance. Between the 2000–2001 and 2010–2011 school years, autism classifications in the American public school system rose by 331%, but the proportion of children in special education programs in the public schools remained static (Polyak, Kubina, and Girirajan 2015). A static special education rate and an increase in autism can occur only if other classifications drop. Indeed, numerous classifications that parents have found uncomfortable if not stigmatizing, such as intellectual disability and specific learning disability, declined as autism became a more common, less frightening, and less shameful diagnosis. The expansion of autism into a spectrum, the decline of mother blame, and the temporary inclusion in the DSM of Asperger’s Disorder as a way to describe people with autism who were intelligent and educable (1994–2013) all reduced stigma and made autism increasingly desirable as a replacement for other diagnoses, especially for children with identifiable genetic syndromes in which autistic features were one part of the syndrome. Some clinicians and researchers now distinguish between nonsyndromic autism (idiopathic) and syndromic autism. For example, syndromic autism is increasingly a term of clinical utility for individuals with Down, Angelman, Cohen, Williams, fragile X, Rett, Cornelia de Lange, 22q11 deletion, and Prader-Willi syndromes (Gillberg and Coleman 2000).
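
To make the arithmetic behind this inference concrete, the following is a minimal sketch using invented enrollment figures; only the reported 331% increase comes from the paragraph above, and all other numbers are hypothetical. It simply shows that if the special education total is held constant while autism classifications grow, the remaining classifications must shrink by the same amount.

```python
# Hypothetical illustration (numbers are invented, not drawn from the article) of why a
# static special-education rate plus a 331% rise in autism classifications
# implies a decline in other special-education classifications.

total_students = 1_000_000              # assumed district-wide enrollment
special_ed_share = 0.13                 # assumed constant share of students in special education

autism_2000 = 5_000                     # assumed autism classifications, 2000-2001
autism_2010 = autism_2000 * (1 + 3.31)  # a 331% increase, as reported

special_ed_total = special_ed_share * total_students   # 130,000 in both school years

other_2000 = special_ed_total - autism_2000   # 125,000
other_2010 = special_ed_total - autism_2010   # 108,450

print(f"Other classifications, 2000-2001: {other_2000:,.0f}")
print(f"Other classifications, 2010-2011: {other_2010:,.0f}")
# Because the total is fixed, every additional autism classification must be
# offset by one fewer classification in some other category.
```

Read against the paragraph above, the drop in the hypothetical “other” figure corresponds to the reported declines in classifications such as intellectual disability and specific learning disability.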

Passage of the Individuals with Disabilities Education Act (IDEA) is often credited with expanding educational opportunities for children with special needs, especially children who reside in areas of the United States with lower access to care (Losen and Orfield 2002). But the distribution of resources has perpetuated inequalities by race and class and has also led to an imbalance in autism diagnoses across diverse communities. Autism-related services can sometimes cost twice as much as those for other classifications. In some public schools, autism has increased as a primary diagnosis for children who reside in states in which these costly services are provided. Fiscal incentives and disincentives play an important role in the number of school diagnoses of autism: the more diagnoses, the more money the school receives (Sigafoos et al. 2010). In Texas and California, for example, the provision of financial support for children with autism spectrum disorder (ASD) led to significant increases in special education classifications and autism in particular (Cullen 2003). On the flip side, autism rates fell when those resources were removed (Kwak 2010). Parents fighting for autism-related services often find themselves in a legal quagmire, depending on the availability of support in their districts. Those who can afford it seek legal support for special education litigation. This also contributes to the profitable legal business in autism, as the rate of autism-related suits is disproportionate to enrollment of students with autism in special education programs in the United States (Zirkel 2011).

From classroom aides to speech and language pathologists, child psychiatrists, and vocational trainers, an increasing number of workers now rely on autism for their income and professional identity. In 2013, 20% of the pediatric caseload of private speech and language pathology practitioners consisted of patients with a diagnosis of autism, and the percentage of those receiving services within schools is probably even higher (ASHA 2012; Brook 2013). Child psychiatrists also perform a crucial function since some school systems require a diagnosis from a board-certified child psychiatrist before agreeing to deliver autism services. There are currently 8,500 board-certified child psychiatrists in the United States, and no state meets the standard (one child psychiatrist per 2,127 children) suggested by the American Academy of Child and Adolescent Psychiatry. If one is lucky enough to get an appointment with a child psychiatrist, the fees can be as high as $600 per hour.

Medical costs for autism interventions involve not only direct expenses incurred from conventional care but also complementary and alternative therapies, many of which are not reimbursable by insurance. These include chelation, hyperbaric oxygen chambers, nutritional therapies, and other treatments based on unproven ideas, such as the hypothesis that autism is caused by chronic bacterial or viral infections, yeast infections, or mercury poisoning (Fitzpatrick 2008). Menus of “biomedical” treatment plans can be found throughout the internet and sometimes include daily regimens of items such as horsetail grass, Ora-Placenta, gold salts, grapeseed extract, fenugreek, milk thistle, and a range of amino acids (Fitzpatrick 2008). Additional costs include hippotherapy, communication devices, and countless trademarked therapies (e.g., SCERTS, Son-Rise, and Floortime). Thirty-one percent of children with ASD use some sort of non-school-based service such as academic tutors, applied behavior analysis, legal aid, and school observation services and consultants (Lavelle et al. 2014). Buescher et al. (2014) estimate that for the United Kingdom, the average lifetime cost of care for a person with autism is $1.4 million; Leigh and Du (2015) estimate that by the year 2025 the total national cost in the United States for caring for people with autism will exceed $461 billion per year.

Recognizing the potential for the increase in autism diagnoses to bring revenue, universities now offer on-campus and online graduate degrees in education or psychology with certification in clinical treatments of autism. There are new PhD programs on autism, programs focusing on autism and play therapy and social skills, and master’s degrees in applied behavior analysis, most of which justify their existence as a necessary response to the increased prevalence of autism. Moreover, the nonprofit sector for autism continues to grow, especially in terms of awareness promotion, the area of philanthropic activity designed specifically to increase the visibility of the symptoms of autism and available services. In 2013, according to tax returns filed with the IRS and available through a GuideStar search,10 the wealthiest 100 nonprofits devoted to autism were worth close to $1 billion in assets and income (e.g., Autism Speaks, the Anderson Center for Autism, and Eden Autism Services). All these developments bespeak the challenges and struggles that parents face as they seek to find services and social supports that will enable their diagnosed children to be educated and launched into the neurotypical world.

Autism Goes to Work

One of the current professional goals of interventions for people with autism is “independent living,” a term first coined in the 1960s by disability rights activists in the United States, launching a “philosophy and a movement of people with disabilities who work for self-determination, equal opportunities and self-respect” (www.independentliving.org). The goal is also for people with disabilities to “show the solutions we want, be in charge of our lives, think and speak for ourselves” (www.independentliving.org).11 The language of the Independent Living Movement influenced the United Nations Convention on the Rights of Persons with Disabilities, which states that its goal is “to enable persons with disabilities to live independently.”

The Independent Living Movement, motivated by a foundational commitment to self-determination, continues to animate disability activism worldwide; yet at times its language has been co-opted and reduced under late capitalism to a hallmark of successful neoliberal existence, a gloss on economic self-sufficiency. This appropriation of “independent living” both embraces and contradicts the original and ongoing meaning of the Independent Living Movement. On the one hand, few would argue against expanding opportunities for meaningful work, a sense of purpose, and integration into community life for all people with disabilities. The more they are able to gain access to the same kinds of practices open to people without disabilities, the easier it will be to reduce stigma. On the other hand, taken too far in certain contexts, this expectation might suggest that a meaningful life is impossible for someone who is not economically productive or able to live on their own, a far cry from the intentions of the activists who catalyzed the Independent Living Movement over a half century ago. Disability activist and scholar Sunaura Taylor offers a cogent critique in her landmark 2004 essay “The Right Not to Work: Power and Disability.” She writes:

The fact is that impairment reveals our interdependence and threatens our belief in our own autonomy. And this is where we return to work: the ultimate sign of an individual’s independence. For many disabled people employment is unattainable. We often simply make inefficient workers, and inefficient is the antithesis of what a good worker should be. For this reason, we are discriminated against by employers. We require what may be pricey adaptations and priceless understanding. Western culture has a very limited idea of what being useful to society is. … Disabled people have to find meaning in other aspects of their lives and this meaning is threatening to our culture’s value system. … The same rule that often excludes the impaired from the traditional workplace also exploits the able-bodied who have no other choice but to participate. The right not to work is an ideal worthy of the impaired and able-bodied alike.

(Taylor 2004)

Similar critiques of the commodification of autism serve as cautionary tales about how categories can be taken up under capitalism in ways that challenge the more inclusive imaginary proposed by activist scholars such as Taylor. For example, Anne McGuire writes of the “autism industrial complex,”12 a phrase she uses to characterize the complex infrastructure for autism, and to criticize the capitalist valuation of the person (McGuire 2013, 2016). The vital economic role of autism is evident in websites like the Autism Speaks Marketplace or Autismthings.com, where autism is branded and commodified and emblazoned on T-shirts, coffee mugs, and other objects. A handful of movie and Broadway theaters are now marketing directly to autistic people and their families, advertising their occasional subsidized “autism-friendly,” “sensory-friendly” performances; for example, the use of strobe lights and pyrotechnics is limited, and quiet spaces with “fidget toys” are provided. Such shows offer a rare welcome for those on the spectrum and their families and allies, accommodating a wide range of autistic behaviors like stimming and vocalizing, with actors prepared for lively if unruly audience participation (Ginsburg and Rapp 2015; Silberman 2015:472).

As Ian Hacking and others have so lucidly described, once a diagnosis takes hold and serves as the hub around which so much wealth, so many people, and so many activities coalesce, it takes on a life of its own as an authentic, naturalized classification (Hacking 2000). This category, in turn, provides an incentive for manufacturing people with the diagnosis of autism whose presence and needs support this financial infrastructure. The autistic person becomes increasingly defined in terms of capital. McGuire notes:

The Starbucks cup, World Autism Awareness Day and the sheer breadth of the “autism industrial complex” all gesture towards the cultural fact that, under neoliberal rule, social and/or economic investment in the untimely autistic child is not just an investment in the realization of the “future-citizen-worker” but in the potential for its realization. In one unbroken—and clearly very lucrative—move, our market-driven times, at once, produce and regulate, create and constrain conducts that are beyond the norm.

(2016:124)

British autism researcher Bonnie Evans (2017a, 2017b) offers a new twist on autism diagnosis in the context of capitalism in the United Kingdom. She suggests that the government promoted autism diagnoses not just to facilitate service delivery but because these diagnoses provided a way to justify the absence of certain kinds of individuals from the workforce as governments were dismantling social welfare programs—that is, the state could argue that autism, and not government policies, was to blame for much of the unemployment in England. She draws on Nikolas Rose’s characterization of neoliberalism as involving the shifting focus of government action from society to individuals (Rose 1998). Autism, Evans says, “grew up as a kind of resistance to a neoliberal agenda, a tool for sheltering certain people” who might otherwise fall through the cracks (2017a).

A more persuasive argument for the increased popularity of autism, however, is that at least some autistic individuals now fit better with our economy and society than ever before. Advocates, especially parents, may have motivated the delivery of government-funded services, but far from separating and sheltering autistic people from the economy, the services have in many cases served to integrate people with autism and other disabilities into it, the goal of many activists. In places where autistic people used to be hidden away, like India, Korea, and South Africa, parents of autistic children are now publicly insisting on their rights to be full citizens. Increasingly, autism is understood less in terms of lack and more as distinctiveness or eccentricity, if not talent and genius.

In large part as a result of the writings of autistic individuals, many of them activists (see, e.g., the works of Temple Grandin and Donna Williams), scholars are chipping away at the assumptions of deficit by identifying rationality, coherence, logic, creativity, and metaphor where these were assumed to be absent (Costa and Grinker 2018; Draaisma 2009; Savarese 2018). Following Biklen (2005), Hacking (2009), and others (see Osteen 2008), Costa and Grinker (2018), for example, draw on phenomenology and philosophy of mind in an analysis of first-person accounts of autism by Sean Barron (Barron and Barron 2002), Lucy Blackman (Blackman 2001), Carly Fleischmann (Fleischmann 2012), Naoki Higashida (Higashida 2013), Tito Mukhopadhyay (Mukhopadhyay 2011), Stephen Shore (Shore 2003), and Daniel Tammet (Tammet 2006) to challenge long-standing assumptions about the nature of autistic cognitive impairment. In psychology and allied disciplines, researchers are detailing new kinds of sociality (Frith 1989; Happé, Briskman, and Frith 2001; Hobson 2014). By listening to the voices of people with autism—Temple Grandin, Ari Ne’eman, Lydia X. Z. Brown, Deej Savarese, and Amanda/Mel Baggs, to name just a few leaders in the neurodiversity movement—researchers are seeking to find a balance between the depersonalized knowledge constructed by science and individual claims for knowledge that had been previously silenced or dismissed as anecdotes. As Jenell Johnson writes (2011), the person with a mental illness used to be almost completely without voice or volition. And if the person spoke or wrote—memoir, fiction, and poetry—what she said was often valued only as evidence of the illness, as when doctors analyze the writings of a patient with schizophrenia, looking only for examples of irrationality and disorder. “To be disabled mentally,” disability scholar Catherine Prendergast notes, “is to be disabled rhetorically,” not because one inherently lacks the ability due to the disease but because it has been taken away by society (Prendergast 2001:45). It is as if when you have a disability you can no longer mean what you say.13 Clearly, times have changed.

Technologies have made new forms of sociality possible for many who in the past may have been isolated; verbal and nonverbal individuals can now use technology to build and maintain meaningful relationships and even new social identities. Beyond social media and online chat groups are new kinds of employment that require extraordinary memory for details about narrow topics and the ability to detect visual and mathematical patterns. Such skills are highly advantageous for computer programming, software development, and other areas of basic science. For this reason, Temple Grandin once described NASA as the largest sheltered workshop in the country. This same argument has led some to wonder if autistic people are responsible for more than we ever imagined. As one journalist wrote, paraphrasing an earlier comment by Grandin, “For all we know, the first tools on earth might have been developed by a loner sitting at the back of the cave, chipping at thousands of rocks to find the one that made the sharpest spear, while the neurotypicals chattered away in the firelight” (Silberman 2001:5). In addition, in response to the tireless efforts of families and self-advocates, employers and schools now offer varied environments for disabled workers, including the neurodiverse. These include sensory-friendly environments, telecommuting, and autism-friendly performances of theater and films. Nonetheless, people on the spectrum are imagined to be consumers of a booming “geek culture” industry of memorabilia and literature related to Star Trek, Dr. Who, Star Wars, computer or hand-drawn animation (anime), and activities like Comicon and Cosplay. Of course, this comes with a risk of stereotyping. It is important to remember the now widely circulated adage: “If you’ve met one person with autism, you’ve met one person with autism” (interview with Shore [Lime Connect 2019]).

In April 2017, 50 large corporations, including JP Morgan, Ford Motor Company, Ernst and Young, and numerous high-tech businesses, met in Silicon Valley to talk about ways to hire more adults with autism. The German software company SAP hosted the event and described how, over the previous 5 years, it had hired 128 people on the autism spectrum. The initiatives were launched not only at the request of employees who have family members with autism but also at the urging of executives who themselves have disabilities, such as Jim Sinocchi at JP Morgan Chase, who is a wheelchair user, and Jenny Lay-Flurrie at Microsoft, who is deaf. Both JP Morgan Chase and Microsoft have programs to hire autistic workers, and some smaller companies are following suit.14 The executives I interviewed following the meeting insist this new openness to inclusion and support of neurodiversity is not a rejection of the capitalist sink-or-swim ideology. Nor do they see their role as a replacement for government services or interventions. Simply put, they are competing for labor.

Michael Fieldhouse, an executive at DXC, a cybersecurity offshoot of Hewlett-Packard, told me, “I’ve talked to colleagues at places like Marks and Spencer, the food company in England, mining executives from BHK in Canada looking for people with good visualization skills, and the leaders at Freddie Mac, and we all agreed that the demand and supply equations were out of whack in some talent pools, especially those we needed at DXC.” James Mahoney, who directs the Autism at Work Program at JP Morgan Chase, insists the initiative was not born of sentiments like compassion and generosity. “We never said, ‘Let’s do the right thing and be charitable.’” For Mahoney, fighting stigma certainly has nothing to do with pity, which is simply stigma clothed as compassion. “We never said we had jobs for people on the autism spectrum. We said, ‘We want talented people and maybe there is a group of talented people we’re not hiring.’” For both Fieldhouse and Mahoney, the “normalization” of autism in the economy is a response to the labor market. Inspired by the Danish company Specialisterne, which was founded specifically in order to hire autistic software engineers, Mahoney created a separate autism recruiting track that focuses more on job skills so that people with social skills deficits do not get eliminated prematurely in the interview process. As Mahoney put it, “An interviewer might write that the applicant was socially awkward, made poor eye contact, and gave long and rambling answers, and then end the interview without ever discovering that the person is an incredible Java coder.”15

The majority of the workforce at Rising Tide, a car wash company in Miami that cleans approximately 160,000 cars a year, is autistic. My daughter has found a job she loves caring for laboratory animals in research settings. At both Rising Tide and the laboratory where my daughter works, the highly repetitive tasks are well suited to many autistic persons’ skills and enjoyment. Autism can become an exemplar of how work and productivity can decrease social exclusion and increase social interactions, but only if we can reconfigure our expectations about what constitutes a valuable life under capitalism.

Conclusion

As the case of autism suggests, stigma decreases when a condition affects us all, when we all exist on a spectrum, with more or less of a certain set of features. With autism, as with many medical diagnoses—like hypertension and obesity—the boundary lines are drawn more by culture than by nature. A spectrum simultaneously presents an opportunity for people to negotiate their subjectivities more freely and to challenge the diagnostic stability and chronicity that so often characterize stigma. The spectrum is also an invitation. It asks us to join the rest of the world on a continuum of suffering. It asks us to say, along with neurodiversity advocates, that both normality and abnormality are fictional lands no one actually inhabits. This was Freud’s great hope: that by showing that we are all neurotic we might understand that we are all afflicted in some way. To his credit, Goffman attempted to do this when he said that the so-called normal have little more than a shaky advantage since everyone has some form of difference to be protected from social disapprobation. This new condition of normality may be the one Andrew Solomon describes in his book Far from the Tree: it is difference, not homogeneity, that unites us. While writing about schizophrenia, autism, deafness, and dwarfism, among other conditions, Solomon realized something important about himself—that as a gay man with a history of serious depression and suicidality, he is more normal than abnormal. “The exceptional is ubiquitous,” he writes. “To be entirely typical is the rare and lonely state” (Solomon 2013:4). Solomon’s perspective may help people to become more empathic and may motivate broader participation in conversations about mental health and disability rights. For example, in autism research, scientists have shown that mild symptoms of autism are common in the general population and that family members of a child with autism often exhibit isolated and subtle autistic traits. With these insights in mind, where might stigma begin and end?

Conventional wisdom holds that stigma is universal—that humans evolved the capacity to stigmatize as a way to protect themselves from dangerous individuals. Yet I argue that we are not hard-wired to exclude people who are atypical. There is nothing natural about any particular kind of shame, alienation, or discrimination. These are attitudes that must be learned within our communities. If stigma is ahistorical, it is not only stripped of its cultural history but also rendered resistant to change. If stigma is universal and ineluctable, it becomes a fact as objective as air and water, and questions about its variability across the globe and throughout history lose meaning. Imagine if we used the word “stigma” less often, or if the word did not exist. Perhaps we could then confront the specific ways a society brands and excludes those who do not conform and understand that those processes are inseparable from our culture, our history, and the possibilities twenty-first-century capitalism offers. Culture put stigma and mental illness together, so we can surely take them apart.

I thank Faye Ginsburg, Rayna Rapp, Danilyn Rutherford, and Laurie Obbink for their leadership in organizing the 2018 Wenner-Gren conference “Disability Worlds” at which an earlier version of this paper was presented. I am grateful to all the conference participants for their insightful criticisms and commentary, especially Michele Friedner, Faye Ginsburg, Cassandra Hartblay, Laurence Ralph, Rayna Rapp, and Tyler Zoanni. Mackenzie Fusco and Wayne Zhang provided first-rate research assistance. I take responsibility for all errors.

Notes

1.  Goffman writes that if one seeks to tabulate the number of people who suffer from stigma, including those related to the stigmatized who experience “courtesy stigma,” the question becomes “not whether a person has experience with a stigma of his own, because he has, but rather how many varieties he has had his own experience with” (Goffman 1963:129). To make this point, Goffman added a passage that many still find disturbing to read: “There is only one complete unblushing male in America: a young, married, white, urban, northern, heterosexual Protestant father of college education, fully employed, of good complexion, weight, and height, and a recent record in sports” (1963:128).

2.  National Trends in Disability Employment (nTIDE), https://www.kesslerfoundation.org/content/ntide-january-2018-jobs-report-americans-disabilities-kick-new-year-sharp-gains-labor-market.

3.  In fact, some economists argue that the ADA is to blame for a decline in the employment of people with disabilities during the 1990s (see, e.g., Stapleton and Burkhauser 2003). It should be noted, however, that such estimates refer to what economists call “measured employment,” not volunteer work or family care (e.g., stay-at-home parents).

4.  Lady Gaga and Prince William have disclosed their emotional struggles, she with post-traumatic stress from an assault, he with depression. David Letterman’s psychiatrist joined him onstage in 2017 when he received the Kennedy Center’s Mark Twain Prize for American Humor. Positive autistic characters are present in children’s and adult television. There is an autistic Sesame Street character, an autistic Power Ranger, and other characters with obvious autistic traits in the films Mary and Max and Adam and in television shows such as The Good Doctor, The Big Bang Theory, Silicon Valley, Community, Hannibal, The Bridge, Sherlock, House, Mr. Robot, and The Walking Dead, among others. The Tony Award–winning plays Dear Evan Hansen and The Curious Incident of the Dog in the Night-Time featured characters with severe social anxiety and autism, respectively.

5.  http://www.who.int/features/factfiles/mental_health/en/.

6.  For Emil Kraepelin, the great classifier of psychoses, “symptoms of mental illness could be seen not only in women but in undeveloped ‘wild tribal people’ with their demonic and magical beliefs, in children, with their ‘spineless submission,’ and in women with their propensity for extremes of excitement, their volatile mood, and lack of self-control” (Barrett 1996:211).

7.  In their book War Neuroses in North Africa (1943), which was known as the “bible” of military psychiatry (Jaffe 2014:139; Shephard 2000:213), Roy Grinker and John Spiegel, in effect, normalized mental illnesses for a time. For them, the more interesting question was not why these soldiers became sick but why so many didn’t become sick. Stigma was reserved for those who failed to enlist or were incarcerated for crimes and severe mental illnesses, including sexual pathology. The New York Times (Laurence 1944:36) prematurely claimed that Grinker and Spiegel eliminated the stigma of mental illness.

8.  There is a similar push today by transgender advocates to eradicate any category of gender identity disorder from the DSM, though it is unlikely to be successful since removing it would eliminate the possibility for transgender individuals to access reimbursable medical services, such as hormone therapy and surgery (Byne et al. 2012; Reed et al. 2016).

9.  US Centers for Medicare and Medicaid Services, https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/NationalHealthExpendData/downloads/highlights.pdf.

10.  GuideStar is a well-regarded source of information and evaluation on nonprofit organizations. https://www.guidestar.org/ (accessed June 24, 2019).

11.  As activist Adolf Ratzka explains: “Independent Living does not mean that we want to do everything by ourselves, do not need anybody or like to live in isolation. Independent Living means that we demand the same choices and control in our every-day lives that our non-disabled brothers and sisters, neighbors and friends take for granted. We want to grow up in our families, go to the neighborhood school, use the same bus as our neighbors, work in jobs that are in line with our education and interests, and raise families of our own. We are profoundly ordinary people sharing the same need to feel included, recognized and loved.” https://www.independentliving.org/.

12.  The concept comes from US President Dwight Eisenhower, who in his 1961 farewell address warned of the “military industrial complex,” the symbiotic relationship between American industry and the military. That symbiosis insulated the military and its corporate partners from the normal operations of the competitive market, since the relationship was to a great extent protected by the government and shielded from the actions and interests of the wider American public (Ledbetter 2011).

13.  Echoing Prendergast, autistic self-advocate and founder of the Autistic Self-Advocacy Network (ASAN), Ari Ne’eman, tells the story of his organization’s interviews with the media to describe their anger over the New York University Child Study Center “Ransom Notes” ads. The ads suggested that autism and other conditions were criminals who had kidnapped children from their parents (Kras 2010). Journalists had a difficult time comprehending that Ne’eman and his colleagues were actually autistic, since they were representing themselves. One writer who reported on her interview with Ne’eman and ASAN filed a story with UPI under the title “Ads Anger Parents of Autistic Children”—but neither Ne’eman nor the other interviewees were parents at all, let alone parents of autistic children (R. R. Grinker interview with Ne’eman, August 1, 2018). See “Ads Anger Parents of Autistic Children,” United Press International, December 14, 2007.

14.  See, e.g., “The Growing Acceptance of Autism in the Workplace,” CBS News, https://www.cbsnews.com/news/the-growing-acceptance-of-autism-in-the-workplace/, and also “These Major Tech Companies Are Making Autism Hiring a Priority,” Monster.com, https://www.monster.com/career-advice/article/autism-hiring-initiatives-tech.

15.  Interviews with Michael Fieldhouse (May 8, 2019) and James Mahoney (September 28, 2018).

References Cited