Despite the persistent myth that Silicon Valley was built by rogue engineers in Palo Alto garages, federal funding — especially from the military — has long been the real developmental engine of the American technology sector. It was robust government spending in science, technology, computation, and higher education that fueled the explosion of American technology after World War II. And these same federal powers eventually rescued the sector when, after Nixon cut the federal defense budget amid the wider economic strife of the early 1970s, it experienced its first major bust.
A few years later, electronics manufacturers responded by forming the Semiconductor Industry Association, a nationwide trade group that lobbied Washington for tech-friendly policies like lower taxes, less regulation, and protection from the thriving Japanese chip sector. Their influence grew when Reagan reversed Nixon’s military cuts in the ’80s. After Reagan announced the Strategic Defense Initiative, a new effort to build a missile shield in space, the Pentagon allocated additional funding to encourage research in technologies like microelectronics, self-driving automobiles, fighter jets, and artificial intelligence. Soon, tech was booming again, with investor buzz around personal computing steadily infusing capital into the Silicon Valley ecosystem.

The following decade saw exuberant expansion of commercial and residential internet use, along with a wave of new startups, V.C. investment, and press lauding the rise of the information superhighway. Between 1990 and 1997, the share of American households with personal computers increased from 15 percent to 35 percent, and in the late ’90s Time magazine named Intel CEO Andy Grove its “Man of the Year” (1997) and Jeff Bezos its “Person of the Year” (1999). Big Tech had successfully leveraged its growing power in Washington to enter the American home.