The ten-page, lightly footnoted memo called “Google’s Ideological Echo Chamber” first circulated on the tech giant’s internal networks, then leaked and went viral early in August. Its author, James Damore, a software engineer working at Google’s headquarters in Mountain View, California, focused his screed on the “factual inaccuracy” of the company’s diversity trainings, in which employees are “regularly told that implicit (unconscious) and explicit biases are holding women back in tech and leadership.” From this the 28-year-old programmer resolutely dissented. “We need to stop assuming that gender gaps imply sexism,” he wrote. “The distribution of preferences and abilities of men and women differ in part due to biological causes.” Devoted, by his own accounting, only to “facts and reason” that could “shed light on these biases,” Damore believed his rational goodwill to be stymied on ideological grounds. “Google’s left bias has created a politically correct monoculture that maintains its hold by shaming dissenters into silence,” he complained. “If we can’t have an honest discussion about this, then we can never truly solve the problem.”
Damore was fired. “The whole point of my memo,” he told Bloomberg TV later, “was to improve Google and Google’s culture, and they just punished me and shamed me for doing it . . . I’m not a sexist.” Many scholars, including evolutionary biologists, publicly debunked his claims to scientific objectivity, but this didn’t limit his reach: Damore’s dismissal transformed him into the right’s latest darling. For his first public interview, he sat down with far-right podcaster and YouTube personality Stefan Molyneux. Not long after, the engineer asked alt-right mouthpiece Mike Cernovich to retweet a photo of him wearing a T-shirt altering Google’s logo to say goolag. Another photo circulated of him holding a sign that read Fired for Truth. Damore’s appeal grew quickly: David Brooks defended him in the New York Times as merely “tapping into the long and contentious debate about genes and behavior” and called for Google CEO Sundar Pichai’s resignation, arguing that Pichai had acted with “moral absolutism” like “the mobs we’ve seen on a lot of college campuses.”
It was the fringe that found itself particularly fired up. Conspiracy theorist and self-described “Republican political operative” Jack Posobiec took the lead in organizing a nationwide March on Google, claiming that “Real Americans are sick of Big Tech’s crackdown on free speech and we’re taking to the streets.” The march was scheduled for August 19, but organizers canceled in the aftermath of the white-nationalist violence in Charlottesville that caused the death of an anti-racism protester. Posobiec claimed “credible” threats to his march from “known alt-left terrorist groups”—though authorities said they had not received any such reports—and posted online that he hoped he and his followers could reschedule soon.
Although the plan was to hold rallies in front of all Google offices, the focus fell on the headquarters in Mountain View, at the center of Silicon Valley, from which this monstrous animating threat of progressive, censoring Big Tech was said to emanate. Some pundits pointed out that the idea of a right-wing march in placid Silicon Valley—situated in an area known for many years as The Valley of Heart’s Delight, where the tepid weather lacks even the urgency of seasonal change—seemed on the surface novel and rather strange. “The conditions are a far cry from just a few years ago,” read an article in the Los Angeles Times, “when technologists were thought of as political neophytes—the descendants of a counterculture that turned hippies into billionaires like Steve Jobs. If there was a political streak in the valley, it tended to be libertarian.”
This narrative of driven hippies lucky enough to strike it rich has somehow never been entirely deflated, even though the area’s strong military presence long provided revenue for tech research and development, and the local industry’s great show of iconoclasm—from slogans like Apple’s “Think different” and Google’s “Don’t be evil” to the broad sartorial insistence on hoodie-slacker chic—has often fronted a most vacuous brand of far-reaching corporatism. And the mythology persists despite the fact that virulent culture-war debate is not actually new in Mountain View. It has been part of Silicon Valley’s story from the beginning.
Two and a half miles west of the bayside Googleplex sits 391 San Antonio Road, where until recently a small metal plaque embedded in the sidewalk marked the former site of Shockley Semiconductor Laboratory. “At this location in 1956,” the plaque read, “Dr. William Shockley started the first silicon device research and manufacturing company in the valley. The advanced research and ideas developed here led to the development of Silicon Valley.” Though the lab moved in 1961, the city and state passed for decades on designating the site a landmark; officials refused any kind of recognition until 1998, when, after long municipal discussion, the plaque was finally installed. But it was removed a few years ago to make way for one of the many large and moneyed developments that have come to define the life and landscape of the region, once mostly marked by its orchards, now inundated by tech wealth endlessly on the rise.
Why the ambivalence? Shockley was a founding father of Silicon Valley—a Nobel Prize–winning physicist who somewhat literally put Mountain View on the map. The city was so unknown when he set up shop there in 1956 that he had to describe it to friends as the “southern edge of Palo Alto.” But Shockley left a distressing legacy, and he’s a complicated historical figure to remember with a monument—particularly in California, where people prefer to imagine the future rather than reckon with the past. (“In California,” Joan Didion once put it, “we did not believe that history could bloody the land, or even touch it.”) In the last two and a half decades of his life, Shockley turned away from technology and began promoting the idea that intelligence was biologically determined—with blacks cognitively subordinate to whites—arguing under the auspices of estimable science that without forced sterilization of those with inferior intelligence, the world would be plunged into dysgenic decline. He claimed to his death in 1989 that he was acting in the service of mankind, according to statistical principles, and was constrained only by the limits of intellectual decorum. That is, he might have said, by a “politically correct monoculture that maintains its hold by shaming dissenters into silence.”
This began in the sixties; the name Silicon Valley saw its first use in print in 1971. A reporter named Don C. Hoefler, who had overheard the phrase used in passing at industry events, titled a column in Electronic News “Silicon Valley, USA,” writing that in Shockley “lies the genesis of the San Francisco silicon story.” It should have been a moment of incipient canonization, an overture to Shockley’s assured tech hagiography. But by 1971 this “silicon story” was no longer even the central story of his life.
Shockley was from the area, raised from the age of three in shady Palo Alto, near Stanford University, where his family moved in 1913. He was troubled even in childhood, angry and antisocial.[1] When he was eight years old, he first encountered quantified intelligence and came out on the wrong end: he was tested by Stanford psychology professor Lewis Terman, who had recently developed the Stanford-Binet IQ test and was beginning to collect students for his Genetic Studies of Genius. To be accepted into the study, a student “genius” had to score 135 or higher on the test; Shockley scored a 129, then was tested again the next year and scored a 125. (His mother, May, a polymath who had attended Stanford and is believed to have been the first person to climb Mt. Whitney solo, was tested in 1919—and recorded a 161.)
Shockley left for Pasadena in 1928 to study physics at Caltech, arriving the year after the discipline had been tilted on its head by Werner Heisenberg’s uncertainty principle. He went on to grad school at MIT, driving east with his close friend Frederick Seitz, who later reported that on the drive Shockley revealed a looming sense of programmatic elitism: “He was inclined to believe that society should be governed by a vaguely defined intellectually elite group, rather than by a majority rule as in a democratic society.” At the time, Seitz didn’t think much of it.
After MIT, Shockley found his way, as did many brilliant young physicists, to the epoch-making Bell Telephone Laboratories, the research wing of AT&T at the forefront of telecommunications research, where he encountered silicon in 1940 while working on radar technology. He interrupted his career rise there only briefly, during World War II, to serve as a scientific adviser to the War Department, offering statistical analyses to improve bombing efficiency. In July 1945, he was asked to assess the costs and benefits of a United States invasion of Japan, calculating that US forces “shall probably have to kill at least five to ten million Japanese” to secure victory, which “might cost us between 1.7 and 4 million casualties, including 400,000 to 800,000 killed.” After atomic bombs were dropped on Hiroshima and Nagasaki two weeks later, Shockley was asked again for an assessment. He estimated the bombs were “profitable by a factor of fifty; that is, the cost to Japan was fifty times the cost to us.” He limited his idea of cost—even in the case of nuclear war—to economic effect and property damage.
Shockley’s postwar return to Bell Labs led to a life-changing discovery. Tasked with supervising a thirty-four-man team to create a solid-state telephone switch, Shockley oversaw the research of John Bardeen and Walter Brattain, who in December 1947 invented a device that could be used to amplify or switch electronic signals, called a point-contact transistor. The men reported their invention, which used the element germanium, to their supervisor. Shockley took it as an affront—he had not been directly involved—and decamped to his home office, locking himself in his study for weeks to improve upon the finding, finally developing something better: a junction transistor, smaller and easier to produce. Even more important to the egotistical Shockley, it was something he could call his own.
This was a major breakthrough, but no one today lives in Germanium Valley; in 1954 Texas Instruments fabricated the first transistor using silicon. The following year, angry at internal politics at Bell, Shockley was ready to leave and raise money for a laboratory of his own, securing funding from Beckman Instruments, run by physicist and inventor Arnold Beckman. Though that company was based in Southern California, Shockley set his lab near his hometown, where his mother still lived. He found space first in Menlo Park (north toward San Francisco), then, in 1956, moved the lab a few minutes south to Mountain View, into a Quonset-style building on San Antonio Road that had once housed an apricot-packing operation. Later that same year came propitious news: Shockley had, along with Bardeen and Brattain, won the Nobel Prize in physics for their transistor invention. Though Shockley had piggybacked on the work of men he supervised, none of the press around the invention had indicated exactly who did which part when; Shockley was treated as an originating inventor as much as the others.
What happened next is often told as a piece of local lore. Shockley was a terrible manager—paranoid, obsessive, controlling, intractable—and before long a group of the engineers he’d hired met with Beckman and threatened to walk if things didn’t change. Beckman doubled down on his support of the Nobelist, and the group, whom Shockley reportedly called the Traitorous Eight, left the lab, secured their own funding, and opened Fairchild Semiconductor, which soon produced the first commercially viable integrated circuit and found the success that was supposed to be Shockley’s. Electronics engineered with silicon-based technology spread through the Santa Clara Valley. Eventually, some of these men—known to history as the Fairchildren—moved on to form many of the valley’s influential companies, notably Intel. Shockley started spending more time in his office at Stanford, which he owed to his friendship with Fred Terman, son of Lewis, who served as provost there. (Terman, who created a high-tech research park near the campus, is today considered the other founding father of Silicon Valley.)
As the industry passed him by, Shockley shifted his focus away from his field of expertise. In 1964 he gave a talk at the Planned Parenthood League of Alameda County about his recently developed fear of “a predominance of the people who can produce the most offspring,” arguing that the problem should be “one of the highest priorities for scientific study by our ablest scholars.” The following year, he was invited to Gustavus Adolphus College in Minnesota for a Nobel conference called “Genetics and the Future of Man.” Here he told of a trip he’d taken to India, where he had witnessed severe overcrowding, and he described a story he’d read in the newspaper, about a white deli owner in San Francisco who’d been blinded by acid in an assault; the perpetrator was a young black man whose family, including a dozen-plus siblings, were all on welfare. This demonstrated, Shockley argued in his remarks, “a situation in which an irresponsible individual could produce offspring at a rate which might be four times greater than those of more responsible members of society”—that is, “a form of evolution in reverse.” And he gave the summation that would drive the rest of his life: “There is no reason to doubt that genetic probability laws apply to human intelligence and emotional traits.”
Later that year he clarified his views to U.S. News & World Report. “We lack proper scientific investigations,” he said, “possibly because nobody wants to raise the question for fear of being called a racist.” When the article ran, colleagues of Shockley’s at Stanford wrote a letter dismissing his claims as “pseudo-scientific justification for class and race prejudice”; his ideas, they said, were “so hackneyed that we would not ordinarily have cared to react” if not for his “standing as a Nobel Laureate and as a colleague at Stanford.” But Shockley would not relent. He asked the National Academy of Sciences, of which his old friend Fred Seitz was now president, to take up the issue. A group of geneticists spent a year conferring before reporting back that it was too complicated, too socially determined, too difficult to isolate all variables. “To shy away from seeking the truth is one thing,” their report read; “to restrain from collecting still more data that would be of uncertain meaning but would invite misuse is another.”
While Silicon Valley’s developments were gaining wider publicity in 1971, Shockley published several papers, including one in the Review of Educational Research in which he described data that could suggest “an increase of 1 percent in Caucasian ancestry raises Negro IQ an average of one point for low IQ populations.” It would be a mistake to ignore this, Shockley argued: “To fail to use this method of diagnosis for fear of being called racist is irresponsible.” He traveled the country to spread his message, presenting as good-faith research an argument that scientific consensus dismissed as irresponsible and meritless, willfully muddling causation with correlation, fomenting racist ideology while claiming to be courageously standing up to an undeserved epithet. In 1973, while at Princeton University for a scheduled debate, Shockley was confronted by a mixed-race crowd of about 400 young people who protested outside the auditorium, burned Shockley in effigy, and chanted “No Nobel Prize for genocide!” That same year he told the New Scientist that however “nobly intended” welfare programs were, they were sure to enforce “a destiny of genetic enslavement for the next generation of blacks.” He was only trying to help, he said, and called himself “the intellectual in America most likely to reduce Negro agony in the next generation.”
In 1980 Shockley gave an interview to Playboy, hoping to clarify these supposed good intentions. It was perhaps his largest audience to date. “Some things that are called prejudice, which are based on sound statistics, really shouldn’t be called prejudice,” he told the magazine. He chose his metaphors poorly, to say the least. “It might be easier to think in terms of breeds and dogs,” he said. “There are some breeds that are temperamental, unreliable, and so on. One might then regard such a breed in a somewhat less favorable light than other dogs.”
That year was big for Silicon Valley—Apple went public in December 1980, generating more capital than any IPO since Ford’s in 1956, the year Shockley had opened his lab, when the idea of a local tech company making so many millionaires must have been inconceivable. But by then William Shockley was out of the picture. His interview became infamous. It still circulates, lending a Nobelist’s esteem to hate groups. In April of this year a photo of a few of Shockley’s responses was shared on Twitter by Fashy Haircut, the account of Nathan Damigo, founder of the California-based white-supremacist group Identity Evropa. “Black students who get into college certainly have equal rights to learn” and “the extra advantages of remedial courses,” reads one response, but they
won’t be able to make the most of them. They can reasonably conclude that something phony in the system is frustrating them. When society endeavors to enforce the equality of achievement by methods like these, then the result may be sort of induced paranoia on the part of blacks.
Damigo, best known for punching an anti-racism protester in Berkeley in May and for coplanning the white-supremacist rally in Charlottesville, underlined the last sentence. “William Shockley predicted #BLM back in 1980 during his interview with @Playboy,” he tweeted to his followers.
“I’m not a sexist,” James Damore claimed. Shockley once argued that his intention was only “to promote raceology, the study of racial problems and trends from a scientific point of view, and this approach is quite different from racism.” When confronted with the flimsiness of their arguments, both men hid behind claims of mistreatment by an intolerant public. So maligned did Shockley feel in this regard that he claimed “there is a significant parallel between the attitude of German intellectuals in Hitler’s day and our intellectuals’ unwillingness to rise to the dysgenic threat.” And so obsessed was he with IQ and intelligence—so enamored of empirical data absent moral purchase—that he claimed the Nazis simply got eugenics wrong: they targeted “the most intellectually advanced segment of their population” with a program that “was anti-Jewish.” Shockley was shameless: “In that they made a mistake, in my opinion.”
Nobody could convince Shockley to stop. When he fell out of the headlines, he wrote that he would manipulate the press, using “the First Amendment as a line upon which I shall endeavor to exert a force so as to deflect the rudder of public opinion and turn the ship of civilization away from the genetic storm I fear is rising over the horizon of the future.” He was the only Nobel winner publicly known to give sperm to the “genius bank,” as the Escondido-based Repository for Germinal Choice was commonly known, serving himself up to be parodied on Saturday Night Live by Rodney Dangerfield in a sketch called “Dr. Shockley’s House of Sperm.” He ran a miserable campaign in the Republican primary for California’s open US Senate seat in 1982 as the “anti-dysgenic candidate”; the KKK was so eager to rally for him that only threats from his lawyer could keep the group away. He commissioned a novel that he hoped would be the Uncle Tom’s Cabin of the fight against dysgenics, to be titled Huntsville: A Journey into Darkness, of which only a few chapters were ever written. In 1984 he won a libel case against the Atlanta Journal-Constitution, having complained about their comparison of his views to Hitler’s; his reputation was so destroyed that he was awarded only $1 in damages.
When Shockley died five years later, he was estranged from family and friends; no funeral was held. But those he worked with continued to charge forward with his cause. His friend Arthur Jensen, a UC Berkeley educational psychologist with whom Shockley had collaborated on research into race-specific intelligence testing, continued the work into the 21st century. Another correspondent and friend, Richard Herrnstein, cowrote The Bell Curve: Intelligence and Class Structure in American Life with the political scientist Charles Murray. (That book was published to tremendous attention in 1994, thanks in part to Andrew Sullivan’s decision to run it as a cover story in the New Republic; it has since been widely discredited.) Meanwhile, Shockley’s message found followers among far-right figures like Damigo and on the neo-Nazi site Stormfront, which once proposed a William Shockley Day to coincide with Martin Luther King Jr. Day. “Shockley said what Jensen was unwilling to say: blacks shouldn’t breed,” wrote one Stormfront forum commenter about ten years ago. “My problem with Shockley is the transistor.”
But as central as his racial beliefs were to his life’s work, it’s not hard to find histories of 20th-century engineering that go out of their way to give these views broader purchase, framing them as scientific method. “It is easy to say Shockley was a racist,” read a 2007 piece in the Electronic Engineering Journal, “and his views were enthusiastically embraced by many who were indeed out and out racists. It is probably more accurate to say that he was an elitist.” That same year, the research publisher Springer put out a History of Semiconductor Engineering that also hedged. “Shockley asked questions that no one wanted to ask, much less answer,” wrote engineer Bo Lojek.
Nobody really proved that Shockley was wrong with his eugenics opinions; however, also nobody joined him in support. He gave to his detractors easy ammunition against him and it was not too difficult for an incoming “politically correct” society to minimize his contribution to mankind.
Lojek might as well have written that only “facts and reason” could “shed light on these biases.”
Haunting these historical treatments—as well as Damore’s memo—is Shockley’s broader intellectual legacy, which lurks on, insidiously, in the contours of Silicon Valley. It rests on Shockley’s rigid conviction in the rightness of his own views, in the pure empiricism of what he saw as the cold facts, in an understanding of ethics as derived from data sets and allowed to be deaf to any outside concern. Shockley was certain not only that he alone was correct but that society was simply too closed-minded and moralistic to listen. He kept in his files a letter from a friend that compared criticism of his research into race and intelligence to the persecution Galileo Galilei experienced under the Inquisition. It’s that sense that lingers on in the scene: that one’s arguments are a truth to be set free by the hard-fought march of scientific progress, that the only thing clouding the better judgment of those who might understand is a mess of arbitrary social restrictions as oppressive as a medieval inquisition.
Damore is not without supporters in his Silicon Valley community, where white men remain largely in power, where belief in pure meritocracy runs indomitable and deep. His arguments echoed loudly among those who might consider themselves part of that “intellectually elite group” a grad school–bound Shockley once described to his friend Fred Seitz. This set includes not only the Google employees who agreed with Damore on internal listservs but also the outside engineers who chorused on Reddit and 4chan that his case testified to the need for anti-diversity causes like Gamergate. And the echo reached powerful venture capitalists and respected tech entrepreneurs who found themselves caught between a corporate impulse to accede to public outcry and an iconoclastic desire, powerful in their industry, to say and do what no one else would dare. But this desire to cast aside consensus views—to see individual thinking as always superior to social understanding or collective goals—permits those possessed of sufficient arrogance and faith in bad methodology to demean the validity and function of moral values in the service of an amorality being redefined as some purer truth. It brings to mind something the philosopher Hans Jonas once wrote about the search for ethics in an age of technology: “We need wisdom most when we believe in it the least.”
Damore did not quote Shockley, but his words echo so closely that it’s not hard to imagine. “As soon as we start to moralize an issue,” wrote Damore in his memo, “we stop thinking about it in terms of costs and benefits, dismiss anyone that disagrees as immoral, and harshly punish those we see as villains to protect the ‘victims.’” It’s no doubt a local logic—but one whose costs need new assessment.
[1] Shockley’s life is well told in a biography by Joel N. Shurkin called Broken Genius and also in Crystal Fire: The Birth of the Information Age, by Michael Riordan and Lillian Hoddeson.