Chapter One

Lost Valley

Silicon Valley has lost its way.
The initial rise of the American software industry was made possible in the first part of the twentieth century by what would seem today to be a radical and fraught partnership between emerging technology companies and the U.S. government. Silicon Valley’s earliest innovations were driven not by technical minds chasing trivial consumer products but by scientists and engineers who aspired to see the most powerful technology of the age deployed to address challenges of industrial and national significance. Their pursuit of breakthroughs was intended not to satisfy the passing needs of the moment but rather to drive forward a much grander project, channeling the collective purpose and ambition of a nation. This early dependence of Silicon Valley on the nation-state and indeed the U.S. military has for the most part been forgotten, written out of the region’s history as an inconvenient and dissonant fact—one that clashes with the Valley’s conception of itself as indebted only to its capacity to innovate.
In the 1940s, the federal government began supporting an array of research projects that would culminate in the development of novel pharmaceutical compounds, intercontinental rockets, and spy satellites, as well as the precursors to artificial intelligence. Indeed, Silicon Valley once stood at the center of American military production and national security. Fairchild Camera and Instrument Corporation, whose semiconductor division was founded in Mountain View, California, and made possible the first primitive personal computers, built reconnaissance equipment for spy planes used by the Central Intelligence Agency in the late 1950s. For a time after World War II, all of the U.S. Navy’s ballistic missiles were produced in Santa Clara County, California. Companies such as Lockheed Missile & Space, Westinghouse, Ford Aerospace, and United Technologies had thousands of employees working in Silicon Valley on weapons production through the 1980s and into the 1990s.
This union of science and the state in the middle part of the twentieth century arose in the wake of World War II. In November 1944, as Soviet forces closed in on Germany from the east and Adolf Hitler prepared to abandon his Wolf’s Lair, or Wolfsschanze, his eastern front headquarters in the north of present-day Poland, President Franklin Roosevelt was in Washington, D.C., already contemplating an American victory and the end of the conflict that had remade the world. Roosevelt sent a letter to Vannevar Bush, a pastor’s son who headed the U.S. Office of Scientific Research and Development. Bush was born in 1890 in Everett, Massachusetts, just north of Boston. His father had grown up, as did generations of his family before him, in Provincetown at the far end of Cape Cod. In the letter, Roosevelt described “the unique experiment” that the United States had undertaken during the war to leverage science in service of military ends. Roosevelt anticipated the next era—and an emerging partnership between national government and private industry—with precision. He wrote that there is “no reason why the lessons to be found in this experiment”—that is, directing the resources of an emerging scientific establishment to help wage the most significant and violent war that the world had ever known—“cannot be profitably employed in times of peace.” His ambition was clear. Roosevelt intended to see that the machinery of the state—its power and prestige, as well as the financial resources of the newly victorious nation and emerging hegemon—would spur the scientific community forward in service of, among other things, the advancement of public health and national welfare.
The challenge was to ensure that the engineers and researchers who had directed their attention to the industry of war—and particularly the physicists, who as Bush noted had “been thrown most violently off stride”—could shift their efforts back to civilian advances in an era of relative peace.
The entanglement of the state and scientific research both before and after the war was itself built on an even longer history of connection between innovation and politics. Many of the earliest leaders of the American republic were themselves engineers, from Thomas Jefferson, who designed sundials and studied writing machines, to Benjamin Franklin, who experimented with and constructed everything from lightning rods to eyeglasses. Franklin was not someone who dabbled in science. He was an engineer, one of the most productive of his century, who happened to become a politician. Dudley Herschbach, a Harvard professor and chemist, has observed that the Founding Father’s research into electricity “was recognized as ushering in a scientific revolution comparable to those wrought by Newton in the previous century or by Watson and Crick in ours.” For Jefferson, science and natural history were his “passion,” he wrote in a letter to a federal judge in Kentucky in 1791, while politics was his “duty.” Some fields were so new that nonspecialists could aspire to make plausible contributions to them. James Madison dissected an American weasel and took nearly forty measurements of the animal in order to compare it with European varieties of the species, as part of an investigation into a theory, advanced by the French naturalist Georges-Louis Leclerc in the late eighteenth century, that animals in North America had degenerated into smaller and weaker versions of their counterparts across the ocean.
Unlike the legions of lawyers who have come to dominate American politics in the modern era, many early American leaders, even if not practitioners of science themselves, were nonetheless remarkably fluent in matters of engineering and technology. John Adams, the second president of the United States, by one historian’s account was focused on steering the early republic away from “unprofitable science, identifiable in its focus on objects of vain curiosity,” and toward more practical forms of inquiry, including “applying science to the promotion of agriculture.” The innovators of the eighteenth and nineteenth centuries were often polymaths whose interests diverged wildly from the contemporary expectation that depth, as opposed to breadth, is the most effective means of contributing to a field. The term “scientist” itself was only coined in 1834, to describe Mary Somerville, a Scottish astronomer and mathematician; prior to that, the blending of pursuits across physics and the humanities, for instance, was so commonplace and natural that a more specialized word had not been needed. Many had little regard for the boundary lines between disciplines, moving among areas of study as seemingly unrelated as linguistics and chemistry, or zoology and physics. The frontiers and edges of science were still in that earliest stage of expansion. As of 1481, the library at the Vatican, the largest in Europe, had only thirty-five hundred books and documents. The limited extent of humanity’s collective knowledge made possible and encouraged an interdisciplinary approach that would be almost certain to stall an academic career today. That cross-pollination, as well as the absence of a rigid adherence to the boundaries between disciplines, was vital to a willingness to experiment, and to the confidence of political leaders to opine on engineering and technical questions that implicated matters of government.
The rise of J. Robert Oppenheimer and dozens of his colleagues in the late 1930s only further situated scientists and engineers at the heart of American life and the defense of the democratic experiment. Joseph Licklider, a psychologist whose work at the Massachusetts Institute of Technology anticipated the rise of early forms of AI, was hired in 1962 by the organization that would become the U.S. Defense Advanced Research Projects Agency—an institution whose innovations would include the precursors to the modern internet as well as the global positioning system. The research behind his now classic paper “Man-Computer Symbiosis,” which was published in March 1960 and sketched a vision of the interplay between computing intelligence and our own, was supported by the U.S. Air Force. There was a closeness, and a significant degree of trust, in the relationships between political leaders and the scientists on whom they relied for guidance and direction. Shortly after the launch by the Soviet Union of the satellite Sputnik in October 1957, Hans Bethe, the German-born theoretical physicist and adviser to President Dwight D. Eisenhower, was called to the White House. Within an hour, there was agreement on a path forward to reinvigorate the American space program. “You see that this is done,” Eisenhower told an aide. The pace of change and action in that era was swift. NASA was founded the following year.
Copyright © 2025 by Alexander C. Karp. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.